Which GPU is 9X faster?

firewood

macrumors 604
Original poster
Jul 29, 2003
7,645
872
Silicon Valley
Which combination of Imagination Technologies' GPUs is 9X faster than the one in the current A4 chip?

And can it do OpenCL?
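For platforms that do expose OpenCL, support is easy to probe programmatically; iOS has no public OpenCL API, so on the iPad this is really a question about the silicon. A minimal desktop-side sketch using the pyopencl bindings:

```python
# Minimal OpenCL capability probe (desktop illustration only; iOS does
# not expose a public OpenCL API). Requires the pyopencl bindings.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        # device.version reports the highest OpenCL version supported
        print(f"  {device.name}: {device.version}")
```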
 

MythicFrost

macrumors 68040
Mar 11, 2009
3,929
38
Australia
Well, this is a very complex question, as I've found.

The SGX543 has 2.5x the performance of the SGX535 in the iPad at the same clock speed, so the SGX543MP2 that's rumoured to be in the iPad 2 would have 5x the performance.

I'd say it'll be an SGX543MP2 clocked higher than the SGX535 in the previous iPad.

There are many factors, though. I can't find any solid info on how many triangles per second the iPad can output, or the iPhone 3GS, iPhone 4, iPod touch 3, or iPod touch 4, for that matter.

From what I've read here, the SGX535 is capable of 14m triangles per second at 200MHz, but I'm not sure whether that's true.
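To put rough numbers on that (assuming perfectly linear scaling with core count and clock, which real hardware won't quite hit):

```python
# Rough GPU speedup estimate: per-core ratio x core count x clock ratio.
# Assumes perfectly linear scaling, which real hardware rarely achieves.

def estimated_speedup(per_core_ratio, cores, clock_ratio=1.0):
    return per_core_ratio * cores * clock_ratio

# SGX543MP2 vs SGX535 at the same clock: 2.5 x 2 = 5x
print(estimated_speedup(2.5, 2))        # 5.0

# To reach the claimed 9x, the MP2 would also need ~1.8x the clock
print(estimated_speedup(2.5, 2, 1.8))   # 9.0
```

So a higher-clocked MP2 could plausibly close the gap between the 5x you get from two cores and the 9x Apple quoted.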
 

deeddawg

macrumors G3
Jun 14, 2010
8,298
2,175
US
And technically he said "up to" 9x faster, meaning it's likely not 9x faster across the board, but only in some particular area. No slight intended, just typical marketing speak.
 

jmpnop

macrumors 6502a
Aug 8, 2010
821
34
And technically he said "up to" 9x faster, meaning it's likely not 9x faster across the board, but only in some particular area. No slight intended, just typical marketing speak.
Agree with this; it's only *up to* 9x faster. So under some *unrealistic testing conditions* you may see 9x the speed of the first gen. In practice it's less than that.
 

foiden

macrumors 6502a
Dec 13, 2008
803
0
Yep. That's how just about all graphics hardware is marketed. It's always *up to*, because the speedup varies with exactly what's being rendered. Unfortunately for anyone wanting exact numbers, applications use a wide variety of graphics techniques, so the speed increase differs from one to the next. Only when they target specific software can they quote more exact performance figures.
 

Piggie

macrumors G3
Feb 23, 2010
8,297
2,487
Yep. That's how just about all graphics hardware is marketed. It's always *up to*, because the speedup varies with exactly what's being rendered. Unfortunately for anyone wanting exact numbers, applications use a wide variety of graphics techniques, so the speed increase differs from one to the next. Only when they target specific software can they quote more exact performance figures.
I don't really think this is the case for the high-end desktop PC market, as they know they'd be ripped to shreds in seconds if they made such distorted claims.

If AMD (ATI) launched a new graphics card and said it's twice as fast as the old one, then in benchmarks and games it damn well better be, or, as I say, they'd be ripped to shreds.

They would never be allowed to get away with printing "up to 9 times faster" on the box and have it run at only twice the speed in typical benchmarks and games.

I would like to see a "typical" speed increase that can realistically be expected in the real world, rather than, say, how much faster it can fill an untextured polygon with a solid colour.

To be honest, if it's twice as fast in the real world then I'd be happy.

I really look forward to seeing games for iPad 1 being enhanced to take advantage of the new speed, and perhaps better textures on the iPad 2.

2x faster is a great bonus on its own; to go from, say, 15 frames/sec to 30 frames/sec would be lovely.
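To put that in frame-time terms (assuming the game is entirely GPU-bound, which real games rarely are):

```python
# Frame-time arithmetic: an N-times-faster GPU divides GPU-bound frame
# time by N. Assumes the workload is entirely GPU-bound (rarely true).

def fps_after_speedup(fps_before, gpu_speedup):
    frame_time_ms = 1000.0 / fps_before   # e.g. 15 fps -> ~66.7 ms
    return 1000.0 / (frame_time_ms / gpu_speedup)

print(fps_after_speedup(15, 2))   # 30.0 fps with a 2x GPU
print(fps_after_speedup(15, 9))   # 135.0 fps if the full 9x were real
```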
 

MRU

Suspended
Aug 23, 2005
25,312
8,706
Other
The SGX543MP2 has exactly 9 times the GFLOPS of the SGX535 at the same clock.
I'm really hoping for the SGX543MP2.

Dual-core graphics would be an awesome upgrade alongside the dual-core A5.
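If that figure is right, the per-core implication is interesting (a quick sketch; compare it with the 2.5x game-performance ratio quoted earlier):

```python
# Implied per-core FLOPS ratio from the "9x GFLOPS at the same clock"
# claim for the two-core SGX543MP2 vs the single-core SGX535.

claimed_total_ratio = 9.0
cores = 2

print(claimed_total_ratio / cores)   # 4.5x per SGX543 core in raw FLOPS
```

Raw FLOPS running well ahead of measured game performance (4.5x vs 2.5x per core) is exactly the kind of gap "up to" marketing lives in.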
 

jclardy

macrumors 68040
Oct 6, 2008
3,398
1,947
Maybe it is the SGX543MP4, like Sony's NGP.

If so, then so much for them being "a year ahead" of everyone else in terms of GPU speed.

I think it is possible if the SGX535 can put out 14m tris/sec. According to this page (http://www.imgtec.com/news/release/index.asp?newsid=449) the SGX543MP4 can put out 133m tris/sec.

133/14 = 9.5

But that may just be wishful thinking. Either way it is quite a substantial upgrade.
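A sanity check on that arithmetic (bearing in mind the 14m figure for the SGX535 is unconfirmed, so the result is only as good as that input):

```python
# Triangle-throughput ratio. The SGX535 figure (14 Mtri/s at 200MHz) is
# an unconfirmed forum number; the SGX543MP4 figure (133 Mtri/s) is from
# Imagination's press release linked above.

sgx535_mtris = 14.0
sgx543mp4_mtris = 133.0

print(sgx543mp4_mtris / sgx535_mtris)   # 9.5 -- right around the "9x" claim
```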

There is a video comparison of the MP2 on Engadget, compared to the SGX540 in the Nexus S I believe (http://www.engadget.com/2011/03/03/imagination-technologies-powervr-sgx543mp2-really-is-faster-be/)
 

fertilized-egg

macrumors 68020
Dec 18, 2009
2,094
10
...
If AMD (ATI) launched a new graphics card and said it's twice as fast as the old one, then in benchmarks and games it damn well better be, or, as I say, they'd be ripped to shreds.
...
I would like to see a "typical" speed increase that can realistically be expected in the real world, rather than, say, how much faster it can fill an untextured polygon with a solid colour.
You make it sound like Apple is the only one playing that game. They aren't. Samsung proudly announced their Galaxy S phone had "at least three times the power of other smartphones", yet in actual games it wasn't anywhere near that.

Also, it's very difficult to predict how fast these chips will be just based on the GPU spec sheet. Here's a case in point:

[benchmark chart: OMAP 4430 vs. other SoCs]
There are other benchmarks done at Anandtech, but the results are similar. Despite using the "older" SGX540 GPU (the same one as in the Hummingbird), the OMAP 4430 beats everyone else on the chart, including Tegra 2, thanks to its higher clock speed and faster memory. There are just too many variables, and the real test will be how developers achieve the end result. And Apple has the upper hand there by default.
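The clock/memory point generalizes: achieved throughput is capped by whichever of compute and memory bandwidth runs out first, so an "old" GPU with a higher clock and faster memory can beat a "new" one. A crude sketch of that roofline-style reasoning, with all numbers invented for illustration:

```python
# Crude roofline-style model: achieved fill rate is the lesser of the
# compute limit (scales with clock) and the bandwidth limit. All numbers
# below are invented for illustration, not real SGX540/Tegra 2 specs.

def achieved_gpixels(clock_mhz, pixels_per_clock, bw_gbs, bytes_per_pixel):
    compute_limit = clock_mhz * pixels_per_clock / 1000.0  # Gpixel/s
    bandwidth_limit = bw_gbs / bytes_per_pixel             # Gpixel/s
    return min(compute_limit, bandwidth_limit)

# "Old" GPU: fewer pixels/clock, but higher clock and faster memory
print(achieved_gpixels(500, 2, 6.4, 4))   # 1.0 Gpixel/s
# "New" GPU: more pixels/clock, but starved by slow memory
print(achieved_gpixels(250, 4, 3.2, 4))   # 0.8 Gpixel/s, bandwidth-bound
```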
 

foiden

macrumors 6502a
Dec 13, 2008
803
0
I don't really think this is the case for the high-end desktop PC market, as they know they'd be ripped to shreds in seconds if they made such distorted claims.

If AMD (ATI) launched a new graphics card and said it's twice as fast as the old one, then in benchmarks and games it damn well better be, or, as I say, they'd be ripped to shreds.

They would never be allowed to get away with printing "up to 9 times faster" on the box and have it run at only twice the speed in typical benchmarks and games.
^ This would work if games didn't vary so much in how much they speed up in tests. It's why reviewers always grab a group of games to test with, since you'll often see the numbers fluctuate from one to the other.

One might put in Crysis and see one improvement, then Doom 3 and see a totally different speed increase. Then they put in StarCraft II, where they might actually see no increase at all. Then they test Minecraft and see something completely different again. It would be something if every game ran on the Unreal Engine and used the same technologies across the board, but that's the real issue: which benchmarks do you use? What counts as typical?
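The usual way reviewers boil that spread down to one "typical" number is a geometric mean of the per-game speedups. A minimal sketch, with made-up ratios for illustration:

```python
# Summarizing varied per-game speedups with a geometric mean, the usual
# single "typical" figure in benchmark roundups. The ratios below are
# made-up examples, not real measurements.
from math import prod

speedups = {
    "Game A": 2.1,   # big win
    "Game B": 1.4,   # modest win
    "Game C": 1.0,   # CPU-bound, no change
    "Game D": 9.0,   # the cherry-picked "up to" case
}

geo_mean = prod(speedups.values()) ** (1 / len(speedups))
print(f"typical speedup: {geo_mean:.2f}x")   # ~2.27x, not 9x
```

One outlier is enough to justify an "up to 9x" sticker while the typical figure sits nearer 2x.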