
Stacc

macrumors 6502a
Jun 22, 2005
888
353

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
http://www.eurogamer.net/articles/digitalfoundry-2016-hands-on-with-mantis-burn-racing-on-ps4-pro
we already knew that the Pro graphics core implements a range of new instructions - it was part of the initial leak - but we didn't really know exactly what they could actually do. As we understand it, with the new enhancements, it's possible to complete two 16-bit floating point operations in the time taken to complete one on the base PS4 hardware. The end result from the new Radeon technology is the additional throughput required to make Mantis Burn Racing hit its 4K performance target, though significant shader optimisation was required on the part of the developer.

In short, there's more to PS4 Pro's enhancements than teraflop comparisons suggest - and we understand that there are more 'secret sauce' features still to be revealed. At the PlayStation Meeting, Sony staff told me that the enhancements made to the core hardware go beyond the checkerboard upscaling technology, and the new instructions certainly support Mark Cerny's assertion that the PS4 Pro possesses graphics features not found in AMD's current Polaris line of GPUs. Interesting stuff, and we look forward to learning more.
GCN4 can do two 16-bit FP operations per clock. However, the current Polaris GPU architecture can't.
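To put that double-rate FP16 claim in rough numbers, here is a minimal sketch. It assumes the widely reported PS4 Pro GPU configuration (36 CUs, 64 FP32 lanes per CU, ~911 MHz); those figures are my assumption for illustration, not something stated in the article.

```python
# Rough throughput estimate under the assumed PS4 Pro GPU figures
# (36 CUs, 64 FP32 lanes per CU, ~911 MHz) -- illustrative, not official.
CUS, LANES, CLOCK_GHZ = 36, 64, 0.911

fp32_tflops = CUS * LANES * 2 * CLOCK_GHZ / 1000  # 2 ops per FMA
fp16_tflops = fp32_tflops * 2                     # two FP16 ops per FP32 slot

print(f"FP32: {fp32_tflops:.2f} TFLOPS")  # ~4.20
print(f"FP16: {fp16_tflops:.2f} TFLOPS")  # ~8.40
```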

I don't know what to think about this...

In this context, the story that Microsoft's Project Scorpio will use an AMD Zen/Vega SoC might be more plausible than we think.

What is also interesting in this context: Xbox has always used EDRAM plus DDR memory in the system. We have seen shots of the SoC surrounded by 12 DRAM chips, which at 32 bits per chip would account for a 384-bit memory bus.

And HBM2 can still serve the EDRAM role.
 

xsmi123

macrumors regular
Jun 30, 2016
134
50
Sylvania, OH
Why do we keep getting NVIDIA news in an AMD thread? I read the NVIDIA threads, but it seems that every time I get an update about this thread, it's NVIDIA news. I apologize if I'm coming off as rude; I'd just like the thread to stay coherent, so to speak.
 
  • Like
Reactions: koyoot

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
My bad, but this was only related to the (odd?) change in fab. Stacc got the point: my issue was indeed how well these will stack up against both AMD's parts and the other NVidia cards on 16 nm. AMD seems to be limited in frequency, and these seem to go higher, possibly due to the architecture.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
My bad, but this was only related to the (odd?) change in fab. Stacc got the point: my issue was indeed how well these will stack up against both AMD's parts and the other NVidia cards on 16 nm. AMD seems to be limited in frequency, and these seem to go higher, possibly due to the architecture.
I posted a long time ago in the Pascal thread that Nvidia would switch from TSMC to Samsung as their fab partner. Nothing new here. The same thing will happen with the big GPUs, and most likely with Volta.

AMD will also refresh the Polaris GPUs on Samsung's process (it already did with the embedded GPUs). Both Vega GPUs and the Zen APUs will be produced on 16 nm FF+.
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
I could see those in the nMP, and maybe that's why Apple is playing the waiting game.
Still, Vega 11 seems far away, but it would fit nicely in the DX300 envelope. Vega 10 would fill the DX500/700 spots.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Does anyone have any idea how an RX 480 can reach 1.475 GHz on the core clock while drawing around 133 W under load and peaking at 149 W?

That is lower than the standard, reference models.

The only thing that would justify this level of power consumption is 1.175 V on the core, which is 0.08 V lower than the reference design. But the GPU die does not clock that high at that voltage!

New silicon? A new revision of the process?
 

Ph.D.

macrumors 6502a
Jul 8, 2014
553
479
Does anyone have any idea how an RX 480 can reach 1.475 GHz on the core clock while drawing around 133 W under load and peaking at 149 W?


Eh, a cherry-picked card sent to a compensated reviewer might be a reasonable guess. Wouldn't be the first time. The box was open when he got it, and he stated he was asked to review it, etc.

But remember, power consumption in clocked circuits is related to the voltage squared, so even a small reduction in voltage can make a big difference in power.
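As a minimal sketch of that relationship (dynamic power roughly proportional to f × V², leakage ignored), using the voltages quoted later in the thread:

```python
# Rule-of-thumb dynamic power scaling: P ~ C * f * V^2.
# The capacitance term cancels when comparing two operating points.
def relative_power(f_new, v_new, f_ref, v_ref):
    """Ratio of dynamic power at (f_new, v_new) vs (f_ref, v_ref)."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# Same clock, core voltage dropped from 1.25 V to 1.175 V (thread figures):
print(relative_power(1266, 1.175, 1266, 1.25))  # ~0.88, i.e. ~12% less power
```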
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Eh, a cherry-picked card sent to a compensated reviewer might be a reasonable guess. Wouldn't be the first time. The box was open when he got it, and he stated he was asked to review it, etc.

But remember, power consumption in clocked circuits is related to the voltage squared, so even a small reduction in voltage can make a big difference in power.
I know this perfectly well. The voltages are shown in the video while benchmarking. Standard voltage under load for this GPU: 1.05 V. Reference voltage on the reference RX 480: 1.25 V. Both at boost core clocks.

There is a problem: reference designs do not clock that high at such low voltages. Remember, this GPU runs 1288 MHz at 1.05 V versus the reference 1266 MHz at 1.175 V. Proof? Here you go:
[Image: RX 480 voltage and frequency curves]

The only thing that could explain it is that the GPU comes from a new revision of the silicon.

Look at the voltage curve and the frequency curve, and compare that to the otherwise implausible 1.175 V at 1475 MHz. Is that possible with previous versions of the silicon?

And to add to all of this: the embedded versions of the Polaris GPUs have a 95 W TBP (Typical Board Power) at the completely stock, reference 1266 MHz (though at an unknown voltage) for the Polaris 10 GPU, and under 50 W TBP at 1.4 GHz(!) for the Polaris 11 GPU. Meanwhile, the RX 460, which uses the Polaris 11 die, rarely goes below 80 W under load at 1.2 GHz core clocks.

So something must have changed recently.
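For what it's worth, applying the same f × V² rule of thumb to the two operating points quoted here (1288 MHz at 1.05 V vs the reference 1266 MHz at 1.175 V) gives a rough estimate of the gap; this ignores leakage and board losses, so treat it as a sketch, not a measurement.

```python
# Rough dynamic-power comparison of the two operating points quoted above.
new_f, new_v = 1288, 1.05    # reviewed card
ref_f, ref_v = 1266, 1.175   # reference design

ratio = (new_f / ref_f) * (new_v / ref_v) ** 2
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")  # ~19%, despite the higher clock
```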
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Or it's a cherry-picked card sent to a paid reviewer.
Previously, Polaris was able to achieve its reference core clock of 1266 MHz at 1.085-1.1 V, and under load the GPU would throttle because of the lack of current required to sustain that frequency. Here, you have an overclocked GPU at a lower voltage than that.

Then we have an overclocked core at the stock (reference) voltage for the reference model: 1.175 V and 1475 MHz. Previously, this GPU was not able to exceed 1400 MHz, and if it did, it required a massive increase in voltage, up to 1.25 V.

This is a MASSIVE difference. This is not an outlier, and I am not making things up here.
 

Ph.D.

macrumors 6502a
Jul 8, 2014
553
479
Well, get back to us when there's more support for an across-the-board change. Until then, it's just one card.
 