Faster refresh rates should really be the next goal after 4K/5K. The 10.5-inch iPad makes a strong case for their value even in everyday use. It's a lot more compelling than going to 8K.
 
  • Like
Reactions: askunk
Faster refresh rates should really be the next goal after 4K/5K. The 10.5-inch iPad makes a strong case for their value even in everyday use. It's a lot more compelling than going to 8K.

Plus 8K with such small pixels runs into Nyquest Frequency really quickly, unless the display is much larger than 27 inches.
 
Faster refresh rates should really be the next goal after 4K/5K. The 10.5-inch iPad makes a strong case for their value even in everyday use. It's a lot more compelling than going to 8K.
That is something Apple could achieve with a display that has an integrated eGPU in it (or with an iMac (Pro)). It could drive one display over 2 or 3 DP 1.3 channels using some exotic MST implementation.
 
TB3 could also see an update to support at least DP 1.3/1.4.

It's coming, but only for USB-C DP 1.4 alt-mode, not for TB3. Enabling DP 1.4 signals in Thunderbolt mode would require rewriting the specification (a TBv4) for the side-channel signals; with USB-C it's easier to implement new signals in alt-mode.

Maybe the TB controllers supporting DP 1.4 in USB-C alt-mode will be named TBv3.1 and stay fully backward compatible with TBv3 devices, but controllers supporting DP 1.4 in TB mode will surely involve a deep protocol revision and faster native TB data rates, and may not be compatible with legacy TB3 without at least an adapter.

About the 120Hz refresh rate: I remember the iMac 5K has the particular feature of refreshing pixel areas of the internal display at independent rates, so I don't think I'm wrong that the iMac 5K supports a 120Hz refresh rate on small screen areas, and it should be capable of delivering a full 5K 120Hz refresh with the RX 580 GPU (I'll confirm later, as I just ordered a 5K iMac to replace both my tcMP and my older iMac).

Of course, the new iMac supports up to 60Hz refresh on EXTERNAL 5K displays, which is good enough for video content and common web browsing; maybe not as dramatic as 120Hz for animations, but very good.
 
Aaaaaand we have the first review of a Core i9 CPU: the 10-core Core i9-7900X.

http://hexus.net/tech/reviews/cpu/107017-intel-core-i9-7900x-14nm-skylake-x/?page=7
[chart: CPU power consumption under load]

That is quite alarming...
 
  • Like
Reactions: JesperA
Well, DP 1.4 cannot do it (it tops out at 5K and 8K at 60Hz), so maybe DisplayPort 1.5? It might need Thunderbolt 4 or a better HDMI spec.

DP 1.4 can do 5K at 120Hz, but it needs Display Stream Compression (the monitor will need a decompressor as well). A wider 10-bit color space plus 120Hz is a problem for mainstream display controllers, though; random, mainstream displays won't have it.
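As a rough sanity check of that claim (my own back-of-the-envelope figures, counting active pixels only and ignoring blanking overhead):

[CODE]
# Rough check: does 5K @ 120 Hz, 10-bit RGB fit in a DP 1.4 link
# without Display Stream Compression? Active pixels only; real
# timings add blanking overhead on top of this.

H, V, HZ, BPP = 5120, 2880, 120, 30      # 10 bits per RGB channel

# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, 8b/10b encoding -> 80% payload
dp14_payload_gbps = 4 * 8.1 * 0.8        # ~25.92 Gbit/s

raw_gbps = H * V * HZ * BPP / 1e9        # ~53.1 Gbit/s
print(f"uncompressed:   {raw_gbps:.1f} Gbit/s")
print(f"DP 1.4 payload: {dp14_payload_gbps:.2f} Gbit/s")
print(f"minimum DSC ratio: {raw_gbps / dp14_payload_gbps:.2f}:1")
[/CODE]

Roughly a 2:1 compression ratio is the floor, so DSC's nominal 3:1 leaves headroom for blanking and overhead.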
 
Plus 8K with such small pixels runs into Nyquest Frequency really quickly, unless the display is much larger than 27 inches.

How do you figure that? Our retinas don't really sample at fixed intervals; it's much more random. I doubt you would or could see aliasing the way a digital camera does, or that all kinds of things in the world, such as various cloth weaves, would cause visual freakouts (not to say some weird stuff couldn't happen under rare circumstances). BTW, I think you mean "Nyquist."
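For scale, a quick estimate of how small those 8K pixels would be; the 27-inch 16:9 panel and 60 cm viewing distance are my assumptions, not anything from this thread:

[CODE]
import math

# Assumed geometry: 8K (7680 px wide) on a 27" 16:9 panel, viewed from 60 cm
DIAG_IN, AR_W, AR_H = 27, 16, 9
H_PIXELS = 7680
VIEW_MM = 600

width_in = DIAG_IN * AR_W / math.hypot(AR_W, AR_H)   # ~23.5 in
ppi = H_PIXELS / width_in                            # ~326 ppi
pitch_mm = 25.4 / ppi                                # ~0.078 mm

# Angle one pixel subtends, in arcminutes (20/20 acuity is ~1 arcmin)
arcmin = math.degrees(math.atan(pitch_mm / VIEW_MM)) * 60
print(f"{ppi:.0f} ppi, {pitch_mm:.3f} mm pitch, {arcmin:.2f} arcmin per pixel")
[/CODE]

At well under half an arcminute per pixel, the dots sit below what a 20/20 eye resolves, whichever way the sampling analogy cuts.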
 
  • Like
Reactions: Jaho101
Typical Koyoot AMD fanboyism at its finest lol. Ignoring all the other great results the 7900X showed on the other graphs.
And where have I said ANYTHING about AMD in that post?

Because I pointed out the power consumption of the CPU, that is fanboyism?

People on this forum never cease to amaze me, in an absolutely negative way.

I suppose in an Intel fanboy's eyes I can be a fanboy.

P.S. If I were an AMD fanboy, I would have posted the benchmark that Aiden posted, wouldn't I? I suppose he is an AMD fanboy, because that is what he posted. I did not. Why? Because the point of the post was not AMD, but the 7900X's power consumption.
 
And where have I said ANYTHING about AMD in that post?

Because I pointed out the power consumption of the CPU, that is fanboyism?

People on this forum never cease to amaze me, in an absolutely negative way.

I suppose in an Intel fanboy's eyes I can be a fanboy.

P.S. If I were an AMD fanboy, I would have posted the benchmark that Aiden posted, wouldn't I? I suppose he is an AMD fanboy, because that is what he posted. I did not. Why? Because the point of the post was not AMD, but the 7900X's power consumption.
The point of the post was to show one out-of-context graph demonstrating that an Intel chip could use a lot of power, while ignoring the fact that it was so much faster than the other chips that it was actually more efficient than many of the lower-powered ones.

Hence "Typical Koyoot AMD fanboyism at its finest lol."
 
^ Exactly what I wanted to say. I am pretty sure that when the ThreadRipper reviews come in, you will be singing a different tune.
 
The point of the post was to show one out-of-context graph demonstrating that an Intel chip could use a lot of power, while ignoring the fact that it was so much faster than the other chips that it was actually more efficient than many of the lower-powered ones.

Hence "Typical Koyoot AMD fanboyism at its finest lol."
Well, obviously, for Intel fanboys, pointing out weak spots in their beloved hardware must be a show of bias.

Where in that post did I say ANYTHING about performance?

Stop reading things into my posts that are not there. My only intention was to show the high power draw. Does considering this chip for my own builds make me an AMD fanboy? You guys are too biased to be open-minded about hardware. I am sorry, but it has to be said.

Aiden, please grow up. I have not said ANYTHING about performance. I have talked only about power consumption, which, funnily enough, is suddenly not important. Who is moving the goalposts on efficiency now?

That is exactly why people do not consider this forum reliable anymore.

And lastly, to everyone reading: just because you are offended does not mean you are right. I am not an AMD fanboy; if anything, I have always used Intel hardware (funnily enough, I have never used AMD hardware in my life...).

Stop calling me an AMD fanboy. Open your mind. And stop reading things into my posts that are not there.
 
Well, obviously, for Intel fanboys, pointing out weak spots in their beloved hardware must be a show of bias.
Why is the fact that a super-chip uses more power than lesser chips an alarming "weak spot"?

Especially on a day when you post news about 300 watt and 375 watt Vega cards....
 
  • Like
Reactions: tuxon86
Why is the fact that a super-chip uses more power than lesser chips a "weak spot"?

Especially on a day when you post news about 300 watt and 375 watt Vega cards.
It is not a super chip; this is only the 10-core version. It uses 80(!) watts more than the 6950X. This chip will land in the iMac Pro, and it is possibly the actual reason Apple was not able to update the Mac Pro 6,1 with these CPUs: the thermal core was designed to handle 150W on each side, and this is roughly a 175W-TDP chip. That is where we get into trouble with power draw. Apple does not have that 500W-TDP cooling solution in the iMac Pro without a reason.

Performance is intact, right where it should be for the Skylake architecture, do not get me wrong. But the power draw is somewhere it should not be for this architecture.

And there are reports in other reviews of high temperatures under load, caused by the use of TIM rather than solder. I do hope that the Xeon CPUs will use solder rather than TIM.

Stop turning this into AMD vs. Intel vs. Nvidia.
 
Aiden, please grow up. I have not said ANYTHING about performance. I have talked only about power consumption, which, funnily enough, is suddenly not important.
But the i9 is more efficient than many of the others: it uses more watts, but it is done much faster. Net result: less energy used.
Stop turning this into AMD vs. Intel vs. Nvidia.
Stop cherry-picking charts out of context....
 
  • Like
Reactions: tuxon86
But the i9 is more efficient than many of the others: it uses more watts, but it is done much faster. Net result: less energy used.
That is correct, using Geekbench as the measure of value. Maybe you have not spotted this, but it's video encoding that loads the CPU to the degree that it draws a massive amount of power. You have a benchmark from Handbrake (a video encoding benchmark) right there. I suggest calculating the efficiency yourself, as I already did. The efficiency picture for this CPU (and the others...) looks a "little" bit different.

P.S. This is also an example of why reviews should be read properly.
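For anyone who wants to redo that arithmetic, here is the calculation in miniature; the numbers are invented for illustration, not taken from the review:

[CODE]
# Energy per job = average package power (W) x time to finish (s).
# Invented numbers, NOT the review's data.

chips = {
    "hot_but_fast":  (160, 300),   # watts, seconds for the same encode
    "cool_but_slow": (110, 500),
}

for name, (watts, seconds) in chips.items():
    wh = watts * seconds / 3600    # watt-hours per encode
    print(f"{name}: {wh:.1f} Wh")

# hot_but_fast: 13.3 Wh vs. cool_but_slow: 15.3 Wh -- a higher-wattage
# chip can still win on energy if it finishes enough faster, and a
# sustained load like Handbrake can flip the ranking the other way.
[/CODE]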
 
Well, obviously, for Intel fanboys, pointing out weak spots in their beloved hardware must be a show of bias.

Where in that post did I say ANYTHING about performance?

Stop reading things into my posts that are not there. My only intention was to show the high power draw. Does considering this chip for my own builds make me an AMD fanboy? You guys are too biased to be open-minded about hardware. I am sorry, but it has to be said.

Aiden, please grow up. I have not said ANYTHING about performance. I have talked only about power consumption, which, funnily enough, is suddenly not important. Who is moving the goalposts on efficiency now?

That is exactly why people do not consider this forum reliable anymore.

And lastly, to everyone reading: just because you are offended does not mean you are right. I am not an AMD fanboy; if anything, I have always used Intel hardware (funnily enough, I have never used AMD hardware in my life...).

Stop calling me an AMD fanboy. Open your mind. And stop reading things into my posts that are not there.

So you have never owned AMD hardware in your life, yet you are cosplaying as an AMD fanboy? That is even worse, my man.

Talking about power consumption: there are two other graphs as well, yet you chose to ignore them.

[chart: gaming power consumption]

[chart: idle power consumption]


Doesn't look so bad, does it?
 
So you have never owned AMD hardware in your life, yet you are cosplaying as an AMD fanboy? That is even worse, my man.

Talking about power consumption: there are two other graphs as well, yet you chose to ignore them.

[chart: gaming power consumption]

[chart: idle power consumption]

Doesn't look so bad, does it?
Seriously, you respond to my points with THIS?

Gaming power draw, with the CPU loaded to what degree? Idle power draw, with the CPU loaded to what degree?

You call me a fanboy and respond with this? Hypocrite.

This whole situation is actually quite funny in the context of what has been discussed over the past few years about the efficiency of AMD GPUs.
 
  • Like
Reactions: AidenShaw
Seriously, you respond to my points with THIS?

Gaming power draw, with the CPU loaded to what degree? Idle power draw, with the CPU loaded to what degree?

You call me a fanboy and respond with this? Hypocrite.

This whole situation is actually quite funny in the context of what has been discussed over the past few years about the efficiency of AMD GPUs.

Heh, they didn't even show to what degree the CPUs were loaded during video encoding, so don't try to argue with me there.

No, it's the fact that it could be spun in any way to support whatever narrative you dream up, without showing the other results. That is the reason I showed the other graphs. That power consumption graph alone doesn't tell the whole story, yet you decided to take it out of context and make it look like it is everything about the 7900X.

And don't even go there about AMD GPUs: if they performed as well as the amount of power they draw suggests, they wouldn't have much of a problem competing against Nvidia.
 
  • Like
Reactions: tuxon86