You're welcome to go do your own analysis and post the results. Instead you're more interested in discrediting people who put in a little effort.

I am certainly not interested in discrediting you or your opinion. I am just pointing out, as someone who, among other things, teaches statistics to graduate students at a university, that your methodology is flawed and your conclusions are not robust. I do not know whether you are right or not, simply because there is nothing I can use to argue the point either way.

As to the offer to do my own analysis: gladly, if the data were readily available. But I'm not really in the mood to spend time writing a web crawler to grab all the Geekbench results and put them into a detailed table.
 
I also never called the results I posted statistics; I provided my own observations and said they were interesting.
 
I don't really care about other results. Do you have anything that proves there's no difference in gaming between the i7/16GB and i9/32GB like you claimed? *crickets*
I can show you gaming benchmarks comparing the 8700 with the 8700K on an RX 470/570. Will that fit the bill?

There will be no difference in gaming performance on GPUs slower than a GTX 1070, no matter what hardware you have, because you are GPU-bound all of the time. The MacBook Pro name will not change this, because it is a simple PC with typical x86 hardware.
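The GPU-bound claim above can be sketched as a toy model: frame rate is set by whichever of the CPU or GPU takes longer per frame, so speeding up the CPU changes nothing while the GPU is the slower part. All millisecond figures below are invented for illustration, not benchmarks.

```python
# Toy bottleneck model: frame time is the slower of the CPU and GPU times.
def fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Frames per second when CPU and GPU work on frames concurrently."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Suppose a mid-range GPU needs 20 ms per frame, and two CPUs need 8 ms vs 7 ms.
slower_cpu = fps(8.0, 20.0)   # GPU-bound: 50 fps
faster_cpu = fps(7.0, 20.0)   # still GPU-bound: 50 fps, the faster CPU adds nothing
print(slower_cpu, faster_cpu)
```

Only once the GPU per-frame time drops below the CPU's does the CPU (or RAM) start to matter in this model.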
 
I don't think that's true at all. RAM and CPU matter in gaming benchmarks too. That's why you can't slap a GTX 1070 Ti into a PC with an Atom CPU and 4GB of RAM and get the same results.
 
Right...

And what happens when you have a 6-core Core i7-8700 with a GTX 1060 or slower? Will the CPU matter in this scenario? What difference will you see between 16 and 32 GB of RAM in games?
 
Here is a video for you comparing 8 GB vs. 16 GB of RAM for games.

If you do not see a difference between 8 and 16 GB, why do you believe there will be a difference between 16 and 32 GB?

I think the sarcasm went right over your head.

It all depends what else you have, or want to have, running in the background while you're playing a game too. I'm a ~30 Chrome tab user and I exceed 16GB of RAM personally.
 
I've seen games using 10 or 12GB of RAM on an APU.
 
What does it take to have someone with a 560X and a Vega 20 display the bloody sensor screen from iStat Menus? For crying out loud, this is the whole point. I get it, Vega is faster, but is it a game changer? If it has 1050 Ti performance at 1050 Ti power requirements, then it is just another AMD chip that's two years late.
 
He also posted these photos. https://drive.google.com/drive/mobile/folders/15XmHCu8w-pPBJtM9KL0u3qyA4qkmM1k1?usp=sharing

Bigger heatsink on the Vega 20 due to the HBM being off-die, I guess.
Thanks! That would add some weight to the possibility that the power consumption in Vega as reported by Apple's sensors includes the memory. @koyoot, you know a lot about AMD: what would the math be there? 16W for GDDR and half of that for HBM? So we would be looking at 8W added to the GPU TDP on Vega?
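The guess above is just back-of-the-envelope arithmetic; spelled out with the post's own assumed figures (16 W for a GDDR5 memory subsystem, HBM at roughly half of that):

```python
# Assumed figures from the post, not measurements.
gddr5_memory_w = 16.0                 # guessed GDDR5 memory subsystem power
hbm_memory_w = gddr5_memory_w / 2.0   # the "half of it" assumption for HBM
print(hbm_memory_w)                   # the ~8 W the post suggests gets added
                                      # to the reported GPU TDP on Vega
```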
 
10W difference on the Vega 16 model, likely more on the 20.
 
Yeah, I've seen this. However, the dude has two laptops that literally cost 10 times more than the first car I bought as a teen, yet he can't afford to publish in 4K so that we could read what was displayed on the sensor screen. I was trying to approximate it by comparing the screenshotted lengths of the bars from iStat Menus in his video against my own, but at one moment it just struck me: wtf is wrong with me that I care so much about it while, apparently, nobody else does. I had already called my broker to unload the AMD stock I own when they disclosed the 25% efficiency improvement from going to 7nm; I guess I just wanted to hedge my bet.
 
Cropped those pics to highlight the main differences for everyone. Old design first.

Screen Shot 2018-11-29 at 7.56.03 PM.png
Screen Shot 2018-11-29 at 7.56.37 PM.png
 
The GPU is faster than a GTX 1050 Ti, and it is hindered by rubbish drivers.
An HBM2 stack at 1.5 Gbps uses about 4W of power, because of its very low voltage.

It is all about the power delivery design. iStat Menus reports VRM power, not GPU power. On GDDR5 GPUs there are two separate VRMs: one for the GPU and one for the memory. Presumably, on HBM2 there is a single VRM, because the memory subsystem is not separate from the GPU itself, hence the "power increase".

It is quite interesting, however. Pairing a 45W TDP GPU (560X) with a more-than-45W TDP CPU will result in throttling of one of the chips. Throttlinggate makes much more sense now, after all.

Edit: does anyone have high-resolution pics of the chokes and phases on the motherboard of the new MBP? I would like to know who the manufacturer of the power phases is, and what phase design we are looking at.
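The VRM-reporting argument above can be sketched numerically. The only figure taken from the post is the ~4 W per HBM2 stack; the GPU die power below is an invented placeholder, and the whole premise (that the sensor reads rail power, with a shared GPU+memory rail on HBM2) is the post's assumption, not a verified fact.

```python
# Sketch: if the sensor reports VRM rail power, a shared GPU+memory rail
# (HBM2) would fold memory power into the "GPU" reading, while separate
# GDDR5 rails would not.
GPU_DIE_W = 35.0      # invented illustrative GPU die power
HBM2_STACK_W = 4.0    # per the post: one HBM2 stack at 1.5 Gbps, ~4 W

reported_gddr5 = GPU_DIE_W                 # memory on its own rail, excluded
reported_hbm2 = GPU_DIE_W + HBM2_STACK_W   # shared rail, memory included
print(reported_gddr5, reported_hbm2)
```

On these assumptions the two cards would show different "GPU" power numbers even with identical die power, which is the apparent increase the posts are discussing.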

This is the single dumbest thing you have done in recent memory ;).

I think you clearly have zero idea about technology, and how hot AMD stock is and will be in 2019 ;).
 
Indeed... the Vega 20 is orders of magnitude faster in pro 3D apps than the 1050 Ti. Games, not so much, but for us 3D content creators Vega is a godsend... I ordered mine!

vega 20.jpg
 
Even in games it is between 15 and 25% faster than the GTX 1050 Ti.

However, it would be faster for all purposes if it had the latest drivers with any sort of optimizations, not just generic GCN architecture driver compatibility.
 
LOL, I'm not going to argue here. I have a pretty bad track record when it comes to predicting what technology will sell, going all the way back to PCI vs. Local Bus. When I saw the first iPod I thought of it as an overpriced POS with cassette-deck sound quality, and don't even ask me what my opinion on the iPad was back in the days of its first release.

AMD just released the RX 590; where are the mid-range Vega-based cards? Too expensive to make?
 

If you want games, NVIDIA is the way to go. AMD usually isn't properly optimised for games and will take a big hit in performance.
 