Faster refresh rates should really be the next goal after 4K/5K. The 10.5" iPad makes a strong case for its value even in everyday use. It's a lot more compelling than going to 8K.
Plus, 8K with such small pixels runs into the Nyquist frequency really quickly unless the display is much larger than 27 inches.
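A rough back-of-the-envelope check of that Nyquist point. The ~24 inch viewing distance and the ~60 cycles-per-degree acuity figure are assumptions for illustration, not numbers from the thread:

```python
import math

# Assumptions (not from the thread): typical desktop viewing distance and
# roughly 20/20 visual acuity expressed in cycles per degree.
VIEWING_DISTANCE_IN = 24.0
ACUITY_CPD = 60.0

def pixels_per_degree(diagonal_in, horizontal_px, distance_in=VIEWING_DISTANCE_IN):
    """Angular pixel density of a 16:9 panel at a given viewing distance."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # physical panel width
    ppi = horizontal_px / width_in                     # linear pixel density
    # Small-angle approximation: one degree spans distance * pi / 180 inches.
    return ppi * distance_in * math.pi / 180.0

nyquist_ppd = 2 * ACUITY_CPD  # need at least two pixels per resolvable cycle

for label, px in [("5K", 5120), ("8K", 7680)]:
    ppd = pixels_per_degree(27.0, px)
    print(f'27" {label}: {ppd:.0f} px/deg vs ~{nyquist_ppd:.0f} px/deg the eye can use')
```

Under those assumptions a 27-inch 8K panel already exceeds what the eye can resolve, while 5K does not, which is roughly the point being made above.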
That is something Apple could achieve with a display that has an integrated eGPU in it (or with iMac (Pro)). It could share 2 or 3 DP 1.3 channels to one display using some exotic MST implementation.
TB3 could also see an update to support at least DP 1.3/1.4.
Well DP 1.4 cannot do it (it runs 5K and 8K at 60Hz) so maybe DisplayPort 1.5? Might need Thunderbolt 4 or a better HDMI spec.
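For the bandwidth side of this, a minimal sketch of the raw pixel-rate arithmetic. It ignores blanking overhead and Display Stream Compression, so the figures are ballpark only, but it also shows why the earlier idea of ganging 2 or 3 DP 1.3 channels per display comes up:

```python
# Rough uncompressed video bandwidth vs. a single DP 1.3/1.4 link.
# Ignores blanking overhead and DSC, so treat the figures as ballpark.

DP13_14_PAYLOAD_GBPS = 25.92   # HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b overhead
BITS_PER_PIXEL = 24            # 8-bit RGB, no HDR

def required_gbps(width, height, refresh_hz, bpp=BITS_PER_PIXEL):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * refresh_hz * bpp / 1e9

modes = [
    ("5K @ 60 Hz",  5120, 2880,  60),
    ("5K @ 120 Hz", 5120, 2880, 120),
    ("8K @ 60 Hz",  7680, 4320,  60),
    ("8K @ 120 Hz", 7680, 4320, 120),
]

for name, w, h, hz in modes:
    need = required_gbps(w, h, hz)
    print(f"{name:11s} ~{need:5.1f} Gbit/s  (~{need / DP13_14_PAYLOAD_GBPS:.1f}x one DP 1.3/1.4 link)")
```

5K at 60 Hz fits a single link, but 5K at 120 Hz and 8K at 60 Hz already need roughly two links' worth of uncompressed bandwidth (DP 1.4 manages 8K60 over one link only with DSC), and 8K at higher refresh rates needs far more, hence the talk of DisplayPort 1.5, a newer HDMI spec, or exotic MST setups.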
Aaaaaand we have the first review of a Core i9 CPU. It is the 10-core Core i9 7900X.
http://hexus.net/tech/reviews/cpu/107017-intel-core-i9-7900x-14nm-skylake-x/?page=7
[chart: CPU power consumption from the linked review page]
That is quite alarming...
Check out http://hexus.net/tech/reviews/cpu/107017-intel-core-i9-7900x-14nm-skylake-x/?page=9, last chart.
Typical Koyoot AMD fanboyism at its finest lol. Ignoring all the other great results the 7900X showed on the other graphs.
And where have I said ANYTHING about AMD in that post?
The point of the post was to show one graph out of context that showed that an Intel chip could use a lot of power, while ignoring the fact that it was so much faster than the other chips that it was actually more efficient than many of the lower powered chips.
Because I pointed out the power consumption of the CPU, that is fanboyism on my part?
People on this forum never cease to amaze, in an absolutely negative way.
I suppose that in Intel fanboys' eyes I can be a fanboy.
P.S. If I were an AMD fanboy I would have posted the benchmark which Aiden posted, would I not? I suppose he is an AMD fanboy then, because that is what he posted. I did not. Why? Because the point of the post was not AMD, but the 7900X's power consumption.
Well obviously, for Intel fanboys, pointing out weak spots in their beloved hardware must be showing biases.
Hence "Typical Koyoot AMD fanboyism at its finest lol."
Why is the fact that a super-chip uses more power than lesser chips an alarming "weak spot"? Especially on a day when you post news about 300 watt and 375 watt Vega cards.
It is not a super chip. This is only the 10-core version, and it uses 80(!) watts more than the 6950X. This chip will land in the iMac Pro, and possibly it is the actual reason why Apple was not able to update the Mac Pro 6,1 with these CPUs: the thermal core was designed to handle 150W on each side, and this is around a 175W TDP chip. This is where we get into trouble with power draw. Apple does not have that 500W TDP cooling solution for the iMac Pro without a reason.
But the i9 is more efficient than many of the others - it uses more watts, but is done much faster. Net result - less energy used.
Stop cherry-picking charts out of context. Stop making this into AMD vs. Intel vs. Nvidia.
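The energy-per-task arithmetic both sides are invoking here, as a minimal sketch. The wattages and runtimes below are purely illustrative placeholders, not figures from the Hexus review or any other benchmark:

```python
# Energy per task = average power draw x time to finish.
# The numbers below are made up for illustration, NOT measured results.

def energy_wh(avg_power_w, runtime_s):
    """Energy consumed by one run of a workload, in watt-hours."""
    return avg_power_w * runtime_s / 3600.0

chips = [
    ("higher-power, faster chip", 230, 400),   # watts, seconds (hypothetical)
    ("lower-power, slower chip",  140, 750),   # watts, seconds (hypothetical)
]

for name, watts, seconds in chips:
    print(f"{name}: {energy_wh(watts, seconds):.1f} Wh per run")

# Peak draw alone does not settle the efficiency question: in this made-up
# example the higher-power chip finishes sooner and uses less total energy,
# but the comparison flips if the speed advantage is smaller than the power premium.
```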
That is correct, if you use Geekbench as the point of value. Maybe you have not spotted this, but it is video encoding that loads the CPU to the degree that it draws a massive amount of power, and you have a benchmark from Handbrake (a video encoding benchmark) right there. I suggest calculating the efficiency yourself, like I already did. The efficiency picture for this CPU (and others...) looks a "little" bit different.
Where in that post have I posted ANYTHING about performance?
Stop reading things into my posts that are not there. My intention was nothing other than to show the high power draw. Does it make me an AMD fanboy if I was considering this chip for my own builds? You guys are too biased to be open-minded about hardware. I am sorry, but it has to be said.
Aiden, please, grow up. I have not said ANYTHING about performance; I have only talked about power consumption, which, funnily enough, right now is not important. Who is moving the goalposts on efficiency now?
That is exactly why people do not consider this forum reliable anymore.
And lastly, to everyone reading: just because you are offended does not mean you are right. I am not an AMD fanboy; if anything, I have always used Intel hardware (funnily enough, I have never used AMD hardware in my life...).
Stop calling me an AMD fanboy. Open your mind. And stop reading things into my posts that are not there.
So you never owned AMD hardware in your life, yet you are cosplaying as an AMD fanboy? That is even worse, my man.
Talking about power consumption, there are two other graphs as well, yet you chose to ignore them.
[chart: gaming power draw]
[chart: idle power draw]
Doesn't look so bad, does it?
Seriously, you respond to my points with THIS?
Gaming power draw, with the CPU loaded to what degree? Idle power draw, with the CPU loaded to what degree?
You call me a fanboy and respond with this? Hypocrite.
This whole situation is actually quite funny in the context of what has been discussed over the past few years about the efficiency of AMD GPUs.