Never been more underwhelmed by an Apple announcement... except for the iPhone 4S... or the 5... or the 5S.

If it were 3 or 4 years ago, this thread would have 45-50 pages.
 
Why would you want an iPad to flex? Surely that wouldn't be very good for the internal components. Regardless, I doubt you could break it: synthetic sapphire is used to make aerospace windows and bulletproof armour. It's also far more scratch-resistant than glass and highly transparent, which is why Apple just used it in the iPhone 5S.

http://www.gia.edu/gia-news-research-Sapphire-Series-Modern-Applications

The iPhone and iPad screens all have a little bit of "give" in terms of flexibility, which makes them durable. The point is not to let the iPad flex, but to let the screen withstand pressure from a finger. If they used truly rigid glass at the thickness of the current display, it would break within minutes.

Aerospace windows can be made from glass because they are several centimeters thick. Bulletproof armour is not made from sapphire.

It's the same principle behind why steel used in buildings is flexible rather than rigid.
 
-A 2X jump up to 155Gflops (this is what the Xbox 360's GPU performs at)

Where are you getting the 155Gflops figure for xbox360's GPU?

Sadly, all the sources I've seen say its GPU (an ATI Xenos) has 240Gflops: wiki, techPowerup

I was hoping the iPad 5/Air might bump up x3 (instead of the usual x2) by using a hypothetical A7X with a G6630 (instead of an upclocked G6430), which would match the Xbox 360... or x4, to be the fastest "console" (until the next-gen launch in November). But no.

However, at least the 2014 iPad will beat an xbox360 - pretty cool for such a tiny device!
 

I think that number is either the max output, or it's GPU and CPU combined. Could you tell me: have you used the Ask Apple chat on their website before, where you can talk to a specialist? Is it free (they say they take no commission)?
 
They should have reduced the bezel size by increasing the screen size, not the opposite. I never had a problem with the iPad's size.:confused:
 
It was unclear whether the Retina iPad mini would be ready for Tuesday's announcements, but in addition to The Wall Street Journal, two reliable sources, KGI Securities analyst Ming-Chi Kuo and AllThingsD, have also indicated that the Retina iPad mini will make its debut on October 22.

Alongside a Retina screen, the iPad mini is expected to include an A7 processor, while the full-sized iPad will offer an A7X processor. Both iPads will include upgraded cameras, and could also offer Apple's new M7 motion tracking chip. AllThingsD has also indicated that Haswell MacBook Pros will be unveiled during the event.

Guess they were wrong about the A7X... They got everything else right.
 
I think that number is either the max output, or it's GPU and CPU combined.

No, have a look at the two links I gave - they are specifically for the GPU. The CPU gives an additional 115.2 Gflops. (Note: the CPU and GPU have confusingly similar names, Xenon and Xenos respectively.)

Not sure what you're getting at with "max output" - usually, the max output is what is listed (for the given frequency in the device). That's what we have for Apple's devices.

But where is your figure from? That was my question...

Could you tell me: have you used the Ask Apple chat on their website before, where you can talk to a specialist? Is it free (they say they take no commission)?

Sorry, I don't know, I hadn't even noticed it actually. Why not give it a go? I can't see how they could charge you without knowing who you are.
 
2X iPad 4 = 155 Gflops
But in the case of the 5S, its "2X" jump was in fact 2.67x (28.8 to 76.8), so that works out to 205 Gflops.

Sorry, you're right about the 360's GPU and CPU. But I know (maybe, lol) that the PS3 is 174 Gflops GPU.

And it is free.
 
But in the case of the 5S, its "2X" jump was in fact 2.67x (28.8 to 76.8), so that works out to 205 Gflops.

Oh man, I'm really not following your arithmetic there... where does the 205Gflops come from? It almost looks like you added the old and new gflops, then added another 100...

As a starting point, I agree that graphics power is:
iPhone 5: 28.8
iPhone 5S: 76.8
iPad 4: 76.8
iPad 5(air): 153.6
Sorry, you're right about the 360's GPU and CPU. But I know (maybe, lol) that the PS3 is 174 Gflops GPU.

Hey, you're right about the PS3's 176 Gflops. Funny, I always thought it was more powerful than the xbox360... (maybe that was the CPU).

Glad it's free!
 

Sorry for not explaining it well, I was very tired. Right:

The iPhone 5S got a "2X" jump in GPU, OK, but that affected the flops by 2.67x, i.e. the jump from 28.8 to 76.8.
It is most likely the same will happen with the iPad Air, as the effect on flops will vary the same way. So the "2X" jump, for flops, is really 2.67x.

2.67 x 76.8 = 205 Gflops
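A quick sanity check of that scaling in Python (all figures are the ones floated in this thread, not official Apple specs):

```python
# GPU figures discussed in this thread (Gflops)
iphone5 = 28.8    # iPhone 5
iphone5s = 76.8   # iPhone 5S (A7)
ipad4 = 76.8      # iPad 4

# Apple's "2X" claim for the 5S actually works out to:
ratio = iphone5s / iphone5
print(round(ratio, 2))          # 2.67

# If the iPad Air's "2X over iPad 4" scales the same way:
print(round(ratio * ipad4, 1))  # 204.8, i.e. the ~205 Gflops above
```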

:)
 

Thanks for explaining! It's so unusual for a company to under-claim their improvements (I guess "x2.67" is more awkward than "x2" or "x3"). I'm not sure that it will transfer to the iPad Air though, if they just increased the frequency. (The reason I think they just increased the frequency is that they used the same A7 name, and I think the Gflops would be much higher if they'd switched to the 6-cluster G6630.)

But maybe you're right. And that would beat a PS3, and be close to an xbox360. Certainly in the same class as consoles. I guess it depends on what the actual games look like, and what they do with it.

It's striking how they did not focus on games, with all that power and with game controller support in iOS 7... I mean, they didn't even run the Infinity Blade III demo that they had for the iPhone 5S launch. Makes me think a specific new gaming product is coming... (OTOH, with x4 the pixels and only x2 the GPU, the iPads have half the effective power of the iPhone 5S - I really think they should have the iPad:iPhone GPU power ratio be 4:1, to match the 4:1 pixel ratio.)
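To put rough numbers on that pixels-vs-GPU point, a quick Python sketch (the resolutions are the standard published ones; the Gflops figures are from this thread, and "effective power" just means Gflops per pixel):

```python
# Gflops per pixel - a crude measure of how far each GPU is stretched
iphone5s = 76.8 / (1136 * 640)     # Retina iPhone 5S
ipad_air = 153.6 / (2048 * 1536)   # Retina iPad, assuming a straight 2x GPU

# Per pixel, the iPad Air ends up with roughly half the iPhone 5S's GPU power
print(round(ipad_air / iphone5s, 2))  # 0.46
```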
 

Right; the whole "2.67x jump instead of 2x" will come to the iPad Air as well. See, the reason it is a 2.67x jump is the following:

Apple said the iPhone 5S GPU is 2X faster. They were not lying; it's just that the effect it had on flops was 2.67x. We will get the same effect on flops. That is what I meant to mention in my last post.

The iPad Air WILL be the first tablet to outperform a current-gen console (PS3/Xbox 360). The competition is very far behind, because Nvidia's mobile GPUs are only set to reach 150 Gflops at the end of next year! The mobile team at Nvidia are idiots, because when they showed off the Tegra 4 they said the iPad 4 looks like a "1999 game". The Tegra 4 was only slightly better than the A6X, and the A6X was already 5-6 months old. The A7 in the 5S is equal to or better than the Tegra 4, and the A7 for the iPad Air destroys it (205 Gflops). Plus, iOS is much more optimized than Android, which helps in the graphics department as it has to render less in the background.


So are you looking forward to the iPad Air then? I certainly am :)
 
Apple said the iPhone 5S GPU is 2X faster. They were not lying; it's just that the effect it had on flops was 2.67x. We will get the same effect on flops.

I disagree, because I believe the cause of the increase is different in the iPad Air vs iPhone 5S (my previous comment has details). But you could be right :) We'll see!

EDIT wait, do you mean that Apple will want to preserve the 1:2 GPU ratio between iPhone and iPad? That's pretty compelling actually...
The A7 in the 5S is equal to or better than the Tegra 4, and the A7 for the iPad Air destroys it (205 Gflops).

I agree it's difficult for Nvidia to compete with Imagination on low-power consumption devices - their whole company has been organized around increasing performance, and it's difficult to change direction. I feel bad for them. Looks like they're heading up-market, to scientific computing (using CUDA for GPGPU). Even worse for them, there's been talk of Imagination coming out with a video board, thus invading Nvidia's home ground... And, actually, Intel already has an x86 SoC with an Imagination GPU (can't recall which, probably an Atom).
Plus, iOS is much more optimized than Android, which helps in the graphics department as it has to render less in the background.

Also agree that iOS is more optimized - and not just the OS: the standard language (Objective-C) is closer to the metal than Java, and additionally iOS "developer culture" has been more focused on fluid UI.

Of course, with so much performance to spare, that culture may start to shift... favoring developers who produce great results more quickly (i.e. with less effort on optimizing). That's how technology usually evolves. But games will stay optimized of course! And the shift will be softened by developers wanting to target older and slower iOS devices, to increase market size.
 

What I have been thinking, though: is Objective-C much more optimized/efficient than the programming language of a PS3 or Xbox 360? Because games like Asphalt 8 were made for the iPhone 5, and that is only 28.8 Gflops overall. Tiny compared to a console, but the graphics and gameplay are not that far off. So if (which I believe is very likely) the iPad Air's GPU overtakes the PS3, doesn't that mean it could do considerably better?

Say the iPhone 5 is 1/10th of the GPU power of an Xbox 360 (28.8 vs 240, just roughly). If we then match the Xbox, doesn't that mean we could be seeing graphics 4-5 times better? Just a very rough thought that the language may be more optimized.

Objective-C is one of the only languages I have not programmed with.

The Apple team at Apple chat said that the iPad Air will be released for reviews on the 1st. So teardowns on the 1st or 2nd. I confirmed this; they said it was the iPhones that got sent out early, but the iPads won't be.
 

My understanding is that Objective-C is like C with objects and (optional) memory management. (If you think of Java as a way to make C++ nicer, Objective-C is another way to make it nicer, but closer to C++ than to Java.) So, usually, it will be less efficient than the assembly and C++ that most console games use. Although, I guess, you could write Objective-C that was just as efficient, since you have access to low-level stuff. But the bottom line is: no, Objective-C in itself won't make iOS games faster than console games.

Also, most of the graphics are handled by the GPU, which doesn't use Objective-C but a C-like shader language (again, my understanding).

If an iPhone 5 game seems close to an xbox360 game, it may be because of fewer pixels. The Xbox outputs 1920×1080; the iPhone 5 is 640×1136. But to be fair, many Xbox games don't run at full resolution. (Maybe iPhone games don't either?)
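Just to quantify that pixel gap (a minimal check; these are the nominal resolutions mentioned above - actual render resolutions vary by game):

```python
xbox360 = 1920 * 1080   # nominal 1080p output
iphone5 = 1136 * 640    # iPhone 5 display

print(xbox360, iphone5)             # 2073600 727040
print(round(xbox360 / iphone5, 2))  # 2.85 - nearly 3x the pixels to fill
```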

I guess the other thing is that not all xbox games have optimized graphics (esp B-grade titles and older titles). You need to compare the best xbox games with the best iPhone games.

For example, most Call of Duty games are 60 fps, which makes them feel very fluid. The latest GTA V probably has the best graphics (and it looks like 1080p to me). Just Cause 2 also runs at 1080p. Even the much older title Far Cry 2 (2, not 3), although it runs at a lower resolution, still looks incredibly realistic much of the time. Games that use id's RAGE engine look extraordinary (though sadly the gameplay is really bad in the ones I've seen - no fault of the graphics engine).

To be fair, as years pass, games for a particular console get better because developers learn how to optimize for that specific platform, so that the most impressive games come out at the end of a console's life. So perhaps comparing recent xbox games to iPhone games isn't fair, because the xbox is a few years older.

I guess one last point is that you can have fantastic gameplay without powerful graphics, so that's not really a measure of the platform's power. (arguably, gameplay might get worse, because as hi-res assets become more expensive to make, you don't want to take risks with them, or make changes and throw them out!)

It might seem possible that the iPhone architecture is somehow better suited to games than the Xbox architecture - things like bandwidth to memory, and the way iOS handles it. While possible, this seems unlikely to me, as the Xbox architecture has been specifically designed for games, the OS is minimal, and the graphics interface (DirectX) has been honed over a far greater number of years on the PC, and with many iterations.

But there is one thing in favour of the iPhone: while Microsoft has had two goes at it (Xbox and Xbox 360), Apple has had 7 goes (1st, 3G, 3GS, 4, 4S, 5, 5S) - and with each iteration they can incorporate lessons learnt from the last one. They can try things and see how they go. Iteration is good for improvement! And... (actually this one is pretty strong!) since Apple started designing its own chips, they can really integrate things well, so that everything is optimized to work together. I think the CPU and GPU of the Xbox were made by different companies - and they're on different chips. I would guess that excellent integration of GPU and CPU in the iPhone might not be accounted for by the GPU Gflops score alone (nor by the CPU Gflops). But this is all just guessing - if it were significant, you'd think tech reviewers would have explored it.

On the downside, Apple will be straining for power efficiency, which consoles never have to worry about.

Oh, yet another thing: touch screens have very high latency compared to game controllers. I think this makes it difficult to perceive the responsiveness of the graphics.

Sorry, that's a long answer, just dumping all the stuff I could think of. It would be interesting to hear what an expert (like John Carmack, who wrote DOOM, Quake etc.) has to say. He actually ported his RAGE engine to an older iPhone, and all the iPhone developers freaked out that he could get such high performance out of it. So... he would know.
The Apple team at Apple chat said that the iPad Air will be released for reviews on the 1st. So teardowns on the 1st or 2nd. I confirmed this; they said it was the iPhones that got sent out early, but the iPads won't be.
Thanks for letting me know - something to look forward to!

My bet: the iPad Air GPU is a G6430 (not G6630), same as the iPhone 5S, just x2 clocked (though they probably can't determine the clocking?)
 

Thanks for telling me all that. The longer the answer, the better!

Really looking forward to AnandTech's teardown. Can't wait to see how it competes against the rest.

If you look at the Geekbench score for the 5S, it is basically double. The same should happen with the Air, so the Air is looking at a score of around 3500 (1750 x 2). That is the score of a 2011 MacBook Air 11"! Great how far things have come, eh :)
 

You mean http://browser.primatelabs.com/ios-benchmarks ? (Isn't that for CPU, not GPU?)
Where is the 1750 figure from?
I think we must be looking at different pages!

Also I couldn't find the 2011 MacBook Air 11" in the mac page http://browser.primatelabs.com/mac-benchmarks (EDIT: found MacBook Air in multi-core list, it's not in single-core)
 

It was roughly 1750 for the iPad 4 multicore score.


Yeah, it is CPU, but with the Series 6 chips taking strain off the CPU, there has been and will be a massive increase. Really just another comparison to show how great the next iPad will be :)

This is a cool thing we've got going here with the comments every day :)
 
It was roughly 1750 for the iPad 4 multicore score.

Are you sure? I'm seeing 1407, 1406 and 1396 for the iPad 4 multicore score... have a look: http://browser.primatelabs.com/ios-benchmarks
Yeah, it is CPU, but with the Series 6 chips taking strain off the CPU, there has been and will be a massive increase. Really just another comparison to show how great the next iPad will be :)
You're thinking of GPGPU? Yeah, I've been thinking that multicore doesn't help much in a general-purpose CPU (dual-core really helps, but quad-core already isn't really worth it, it seems to me). However, it works fantastically for GPUs... so for the people wondering why we don't have massively many-core CPUs yet... we do... it's just that they're GPUs. Supercomputers are now being made from them, and that's the area Nvidia's going into.

Of course, a GPU architecture is not as general as a CPU architecture, but I believe that that's just a limitation we have to face. We're *not* going to be able to make single threaded apps multi-core very easily; even with a lot of work, we still can't get the full benefit. People are optimistic about functional programming, but with decades of work, I haven't seen any real solutions.

I believe that the only tasks we are *ever* going to be able to parallelize are the so-called "embarrassingly parallel" ones - tasks that are trivially decomposable into pieces that can be run independently... like map-reduce, like serving many web requests, like ray-tracing and shaders.

We will only make progress by finding more tasks like that (or by changing our perspective, to find a way to *see* an approach to an existing task like that). And when we do, they will (likely) fit into a GPU's architecture. [counterpoint: I guess we might have independent tasks that need a little more power than GPU units, e.g. I think they have no stack, so can't do recursion].
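As a toy illustration of that "embarrassingly parallel" shape, a Python sketch (the shade function is invented for illustration - the point is only that each item depends on nothing but its own input, so the work splits trivially across cores):

```python
from multiprocessing import Pool

def shade(pixel):
    # Stand-in for per-pixel work (a shader, a ray, a map step):
    # it depends only on its own input, so every call is independent.
    x, y = pixel
    return (x * x + y * y) % 256

if __name__ == "__main__":
    pixels = [(x, y) for x in range(100) for y in range(100)]
    with Pool() as pool:
        result = pool.map(shade, pixels)  # trivially split across workers
    print(len(result))  # 10000
```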

And when that happens, Apple will already have over-powered GPUs ready and waiting.

There's actually hard evidence of this thinking: one of Apple's bigger machines has two GPUs, and only one can be used for graphics - the other must be used for computing. I'm not sure the market is ready for that yet, but it proves that Apple has an interest in that direction.
</RANT>
 

Never knew Apple were in on that; I thought it was still at least 5 years away. Am I right in thinking that the add-on options for the iMac max out at 3 GPUs? Because if they do, and each GPU is dual-core, that means there would be a total of 6 cores. I wonder if this would be a good investment? Or maybe doing it on a Mac Pro?

Anyway, could you tell me what the CPU would then do? Something I have always wondered...

Because the way I thought it would work is that the CPU would stay but get less powerful every time the GPU gets more powerful, working in inverse proportion.

As for the iPad Air: when will you be getting yours? I'm getting mine on release day (64GB). But I can't choose between Space Grey or Silver. Does white fade/go yellow/get dirty easily?
 
Because the way I thought it would work is that the CPU would stay but get less powerful every time the GPU gets more powerful, working in inverse proportion.
I think they'd make the CPU as powerful as possible (in tradeoff with power consumption), it's just that it's limited. But the GPU would keep getting more powerful. The CPU is still needed for managing the GPU.

But... if the CPU was *only* doing that management, there might be a dramatic shrink, exactly as you say. I think the problem is there will still be lots of tasks that only the CPU can do.

As for the iPad Air: when will you be getting yours? I'm getting mine on release day (64GB).

Not yet. I love the iPad concept, but it doesn't meet my particular needs so far. Mostly I do development, so I really need a keyboard. My next device might be the Acer C720 Chromebook, but I'd like something smaller (7" maybe, like a mini). I could attach a keyboard to an iPad, but that kind of wrecks it, and the iPad ecosystem doesn't have much support for development on the unit itself.

The input lag also puts me off... though it's getting better every generation. Have you seen the MS experiment where they show what it's like to have less lag? When it gets below 1ms (IIRC), it becomes like a real object. The iPad 4 was at about 70ms. That's about twice as fast as the competition (Samsung), but there's a long way to go.

Ironically, for development so far, it's all in the CPU; the GPU isn't used at all. And x86 chips have been better at raw power than ARM processors... but with Apple doubling every year, they are closing the gap.

But I can't choose between Space Grey or Silver. Does white fade/go yellow/get dirty easily?

I don't think the white would fade or go yellow, but I guess it might get dirty easily. Hey, lots of people already have white iPads, why not ask the forum or google it?
 

The white doesn’t really get dirty unless you’re using it while doing an oil change or something :) Our white iOS devices get significant use - including a white iPad used by a 5-1/2 year old - and while they pick up fingerprints and the occasional dark smudge, it’s so easily wiped clean, it’s not a big deal. I also don’t see much difference in using it for things like video (some people think the darker bezel increases the perceived brightness/contrast/etc.).

We’re probably going to pick up a Mini Retina, and it’ll be white too.

Enjoy the new Air :cool:
 

Thanks :) Still undecided though! I think it will be either a black iPad with a black Smart Case, or a white iPad with a black Smart Case.
 

I had all black iOS devices up until I picked up our iPad 3 a couple of years ago, and started really digging on white. This is totally subjective (as there’s no measurable difference), but the _experience_ is more interesting to me. I don’t know if it’s the “tech-of-the-future” vibe the white device gives off, or the differentiation vs. all the other black bricks, but I really prefer it now.

We’ve still got black iOS and Android devices in the house, all our other electronics are black (AV systems/AppleTVs/TVs/etc.), so it’s a nice change.

My current iPhone 5 is white too (with a white/silver Elago slim case). The white iPad 4 has an iCarbon white carbon rear protector with a black logo :D
 