
JronMasteR

macrumors 6502
May 4, 2011
327
126
Switzerland
"Every app that can run tasks in parallel or can split a task across multiple threads benefits from more cores. However, that doesn't mean that clock speeds don't matter.
Get a 2990WX and clock it to 2 GHz vs. 3 GHz and check which is faster... Seriously, is that even a question."
Faster, lol. I guess you know nothing about multi-core CPUs.
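For reference, the claim in the quote reduces to simple arithmetic: on an ideally parallel workload, throughput scales with cores times clock, so at a fixed core count the higher clock always wins. A toy sketch (idealized, ignoring memory bandwidth, turbo, and scheduling overhead):

```python
# Idealized throughput of an embarrassingly parallel workload: work is
# split evenly across cores, each retiring instructions at clock * IPC.
# Treat this as an upper bound, not a benchmark.
def throughput(cores, clock_ghz, ipc=1.0):
    """Billions of instructions per second, idealized."""
    return cores * clock_ghz * ipc

# The same hypothetical 32-core chip at the two clocks from the quote:
assert throughput(32, 3.0) > throughput(32, 2.0)  # clock still matters
# And at the same clock, more cores win on parallel work:
assert throughput(32, 3.0) > throughput(16, 3.0)
```

Real chips fall short of this upper bound, but the direction of both comparisons holds.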

"And no, 9900Ks do not suffer from overheating issues. They run well within spec. You don't need a water cooler at all if you run them stock or with a slight overclock. If you overclock them to the limit and put AVX loads on them, yeah, they get very warm. But then I'm talking Prime95 torture tests.
It's warmer than its 6-core predecessor, sure, because they crammed 2 additional cores in there. But just because it runs warm does not mean it's overheating."
Are you living in this world? There are tons of videos and links about 9th-gen CPU issues. Go check them.

"And which game requires 6-10 cores? Games almost always rely on per-core performance, so modern quad cores run every game just fine. Check benchmarks. And 4K games are 99% limited by the GPU... it's the 1080p and 1440p high-refresh-rate enthusiasts with the best GPUs who get bottlenecked by the CPU."
Both the PlayStation 4 and Xbox already support 8 cores. For 4K games, it is necessary to have 8~10 cores. The Division 2 is an example. Battlefield V uses all 8 cores.

So, you are saying the 2 GHz clocked 2990WX will perform better than the 3 GHz clocked one... Sure, are you living in this reality?

Watched all the videos needed about 9th-gen CPUs and I am using one, so I have all the experience I need. You are just repeating what some YouTubers said...

Really, consoles? Because consoles have 8 cores, games require 8 cores? Seriously, are you living in the same world? 4K gaming requires 8 cores???
 

mavericks7913

Suspended
Original poster
May 17, 2014
812
281
"So, you are saying the 2 GHz clocked 2990WX will perform better than the 3 GHz clocked one... Sure, are you living in this reality?"
You don't even think about the core count. Are you joking?

"Watched all the videos needed about 9th-gen CPUs and I am using one, so I have all the experience I need. You are just repeating what some YouTubers said..."
Too bad, since you can't check all the data on Google.

"Really, consoles? Because consoles have 8 cores, games require 8 cores? Seriously, are you living in the same world? 4K gaming requires 8 cores???"
HAHAHA. Do you even build computers and play 4K games? 4K requires 8~10 cores, 16 GB of RAM, 11 GB of VRAM, and at least an RTX 2080 Ti. WTH are you talking about? You seriously know nothing about true 4K games at all. Did you even check the requirements for 4K games?

These are the requirements to play The Division 2 at 4K/60 fps:
  • OS: Windows 7 | 8 | 10
  • CPU: AMD Ryzen 7 2700X | Intel Core i9-7900X
  • RAM: 16 GB
  • GPU: AMD Radeon VII | Nvidia GeForce RTX 2080 Ti
  • VRAM: 11 GB
  • DirectX: DirectX 11 | 12

This is why the newest games are starting to support at least 6 cores, and why 4K games require 8~10.
 

JronMasteR

macrumors 6502
May 4, 2011
327
126
Switzerland
Yeah, that game definitely needs 10 cores... Especially at 4K...

https://www.guru3d.com/articles_pages/the_division_2_pc_graphics_performance_benchmark_review,7.html
 

mavericks7913

Suspended
Original poster
May 17, 2014
812
281
Told ya.
I have yet to see a game requiring 10 cores. Link?

[edit] never mind, this BS has already been refuted.
I guess you don't know how to calculate the requirements. You need 2 extra cores as headroom.
But Intel also has CPUs with high clock speeds - and as far as being specific, how can I be more specific than quoting the exact part number and specs from http://ark.intel.com ?
Don't be ignorant. I told you that both Intel and AMD have slow CPUs, and you ignored it.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266
Game resolution itself has nothing to do with the number of CPU cores; it is entirely up to GPU fillrate. Benchmarks clearly show that performance doesn't scale beyond 6 cores, because they all become GPU bound.
 

mavericks7913

Suspended
Original poster
May 17, 2014
812
281
Game resolution itself has nothing to do with the number of CPU cores; it is entirely up to GPU fillrate. Benchmarks clearly show that performance doesn't scale beyond 6 cores, because they all become GPU bound.

That doesn't explain why the Xbox and PlayStation use 8 cores.

Do you see how much CPU performance it requires? 4K resolution requires not only GPU but also CPU power.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266
That doesn't explain why the Xbox and PlayStation use 8 cores.

Do you see how much CPU performance it requires? 4K resolution requires not only GPU but also CPU power.

It is not about the 4K resolution; it is about the scene complexity and the amount of physics, which is what impacts CPU usage. The ability to scale to multiple cores =/= requiring a specific number of cores. And don't even argue from the Xbox and PS4's 8-core Jaguar processor, since those cores are considerably worse than even a 4-core Broadwell.

You just keep bringing source after source but keep doing incorrect analysis to support your claims.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,286
3,882
Intel canceled 10nm for this year, and 14nm will remain until 2021. Btw, 14nm has been in use since the Broadwell microarchitecture from 2014~2015.

No, technically they did not cancel 10nm for this year. If you bothered to actually look at the multiple charts in the Tweakers article (which the watermark was slapped on; or go through the regurgitation at wccftech), you get to this chart.



https://tweakers.net/nieuws/151984/...l-in-2021-nog-desktop-cpus-op-14nm-maakt.html

See the purple swimlane labeled 10nm... yep, that swimlane starts in 2019.

Now, the "doom and gloom" is more so in the desktop and Xeon W / SP space. 10nm there was not promised by Intel for 2019 (at least not in the last 6-10 months; never mind roadmaps from 2+ years ago).

This chart is just desktop.



https://tweakers.net/nieuws/152112/...tot-eind-2020-bevat-geen-10nm-processors.html

The quirky part of this roadmap is that "Cascade Lake Refresh SP"... when Intel has openly tagged the iteration after Cascade Lake as Cooper Lake. If this is a 2018-2020 timeline, just when in 2018 (or perhaps late 2017) was this slideware prepared? The earlier in 2018 the date (or even more so, late 2017), the more suspect it is given the current state.



Now, 10nm is gone based on their CPU roadmap.

No, it isn't, even going by the source you are trying to pull from.

10nm is slow to make (they're starting to make these in Q2, and systems won't come to market until about Q4), and it appears they have bigger issues with larger dies (hence mobile only).

https://www.anandtech.com/show/1427...tion-of-10nm-ice-lake-cpus-raises-10nm-volume

Intel is already incurring significant expense making stuff that they will sell later.

Furthermore, a document leaked from Cisco (as opposed to this one from Dell) was tagged with a December 2018 origin.

" The document detailing Cisco’s Unified Computing System, was published in December 2018 for Cisco ..."
https://www.anandtech.com/show/1393...ade-lake-cooper-lake-and-ice-lake-for-servers

This relatively new document mentions Cascade, Cooper, and Ice Lake in the 2019-2020 time frame. This other Dell document has no Ice Lake, so how current could it be? No Cooper, no Ice Lake. Intel talked about both of those at the December architecture reference day, so the Dell document probably isn't more recent.

Can Intel make Ice Lake work as a huge monolithic die on their 10nm process? Perhaps not, but it doesn't have to be monolithic inside a bigger package with more pins.

There is a pretty good chance that the desktop swimlanes in an updated document look like the mobile ones, where Intel will have both 10nm and 14nm products overlapping in some of the major product groupings.


Even Jim Keller mentioned that Intel needs a new CPU architecture like AMD's Ryzen, and that it will take more than 3 years.

Again, context? Was Keller saying there was nothing to do for 3 years, or that over the span of 3 years there would be more refinements/incremental updates? That really would not be abnormal. There should be increments past Sunny Cove, which will ship this year in the mobile units.
 

mavericks7913

Suspended
Original poster
May 17, 2014
812
281
It is not about the 4K resolution; it is about the scene complexity and the amount of physics, which is what impacts CPU usage. The ability to scale to multiple cores =/= requiring a specific number of cores. And don't even argue from the Xbox and PS4's 8-core Jaguar processor, since those cores are considerably worse than even a 4-core Broadwell.

You just keep bringing source after source but keep doing incorrect analysis to support your claims.

The 4K resolution itself requires more compute power, both CPU and GPU. Try it yourself with a 4-core CPU. The video that I attached already proves that more cores are necessary. I have no idea why you are saying that CPU cores are not important, especially with 4K games, when the newest AAA games support more than 6 cores these days.

You are the one who does not understand the 4K game requirements.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266
The 4K resolution itself requires more compute power, both CPU and GPU. Try it yourself with a 4-core CPU. The video that I attached already proves that more cores are necessary. I have no idea why you are saying that CPU cores are not important, especially with 4K games, when the newest AAA games support more than 6 cores these days.

You are the one who does not understand the 4K game requirements.

That has to do with games having more complex IQ and physics, not the resolution itself. 4K resolution itself is nothing special, since you can scale a 10-year-old game up to 4K and still end up GPU bound rather than CPU bound. Those videos show that games can scale beyond 4 cores and distribute work nicely across 8 cores or maybe more, but that doesn't show that games require more than 4-6 cores at present.

If a game stressed 8 cores at more than 90% most of the time, that would mean the game requires 8 cores or more, but it is not showing that, since it barely stresses them beyond 60% most of the time.
Yes, it is good to have more than 6 cores for games, since that leaves extra CPU room for other tasks, including streaming, but no, it is not an indication that a game "requires" 8 cores.
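The "supports" vs. "requires" distinction is easy to demonstrate in code: an engine that splits per-frame work across a task pool produces identical results on fewer cores, just with less headroom. A minimal sketch (the job names are hypothetical, not from any real engine):

```python
# A game that "supports 8 cores" usually just feeds per-frame jobs
# (physics, AI, audio, streaming, ...) into a task pool. The same jobs
# finish on any pool size; fewer workers only mean less headroom per
# frame.
from concurrent.futures import ThreadPoolExecutor

FRAME_JOBS = ["physics", "ai", "audio", "particles",
              "animation", "culling", "streaming", "ui"]

def run_frame(worker_count):
    """Run all frame jobs on a pool of the given size; return the results."""
    with ThreadPoolExecutor(max_workers=worker_count) as pool:
        return sorted(pool.map(lambda job: f"{job}:done", FRAME_JOBS))

# Identical output on 4 or 8 workers: the engine scales, it doesn't require.
assert run_frame(4) == run_frame(8)
```

The same frame completes either way; only the wall-clock time per frame (and thus the headroom for hitting a frame-rate target) changes with core count.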
 

mavericks7913

Suspended
Original poster
May 17, 2014
812
281
That has to do with games having more complex IQ and physics, not the resolution itself. 4K resolution itself is nothing special, since you can scale a 10-year-old game up to 4K and still end up GPU bound rather than CPU bound. Those videos show that games can scale beyond 4 cores and distribute work nicely across 8 cores or maybe more, but that doesn't show that games require more than 4-6 cores at present.

If a game stressed 8 cores at more than 90% most of the time, that would mean the game requires 8 cores or more, but it is not showing that, since it barely stresses them beyond 60% most of the time.
Yes, it is good to have more than 6 cores for games, since that leaves extra CPU room for other tasks, including streaming, but no, it is not an indication that a game "requires" 8 cores.

lol, it does. Stop ignoring the fact that recent games now support 8 cores, especially 4K games. Both Xbox and PlayStation already set the requirement at 8 cores, and yet you don't care about this.

Also, that video testing is based on 1080p, not even 4K. Do you now realize why 8 cores are required for 4K gaming?
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266
lol, it does. Stop ignoring the fact that recent games now support 8 cores, especially 4K games. Both Xbox and PlayStation already set the requirement at 8 cores, and yet you don't care about this.

Also, that video testing is based on 1080p, not even 4K. Do you now realize why 8 cores are required for 4K gaming?

Please stop, your lack of knowledge on this subject is showing. Shame on me for continuing to respond to your posts. :eek:
 

frou

macrumors 65816
Mar 14, 2009
1,295
1,787
Apple would most likely have to do a lot of work on their OS kernel to get every aspect of the Mac running bulletproof on contemporary AMD CPUs and chipsets.

I don't see them considering that being worth the hassle.

What would be worth the hassle is moving to in-house chips once-and-for-all as and when it becomes practical for each product line.
 

Fl0r!an

macrumors 6502a
Aug 14, 2007
909
530
Also, that video testing is based on 1080p, not even 4K.

That's actually the point. At 4K the CPU load would be way lower, because almost any system will be GPU bound. You'd barely see a difference between a 4C and a 6C CPU, never mind 8-10 cores. This has been proven in countless benchmarks, e.g. this one: https://forums.macrumors.com/thread...ll-or-after-2021.2179645/page-2#post-27325237

The number of pixels generated by the GPU doesn't affect the CPU load at all! It just depends on scene complexity, physics, refresh rate, ...

However, at low resolutions such as 1080p the GPU load decreases, which allows for higher framerates, moving the bottleneck to the CPU.
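That bottleneck argument fits in one line of arithmetic: frame time is roughly max(CPU time, GPU time), and only the GPU term grows with pixel count. A toy model with made-up per-frame costs:

```python
# Toy bottleneck model: a frame costs some CPU time (draw calls, physics,
# game logic) and some GPU time that grows with pixel count; whichever
# unit finishes last sets the frame time. All numbers here are made up.
def frame_ms(cpu_ms, gpu_ms_per_mpix, megapixels):
    """Frame time in ms, assuming CPU and GPU work fully overlap."""
    return max(cpu_ms, gpu_ms_per_mpix * megapixels)

CPU_MS = 5.0           # per-frame CPU cost: does not change with resolution
GPU_MS_PER_MPIX = 2.0  # hypothetical GPU cost per megapixel rendered

t_1080p = frame_ms(CPU_MS, GPU_MS_PER_MPIX, 1920 * 1080 / 1e6)
t_4k = frame_ms(CPU_MS, GPU_MS_PER_MPIX, 3840 * 2160 / 1e6)

print(f"1080p: {1000 / t_1080p:.0f} fps")  # GPU needs only ~4.1 ms: CPU bound
print(f"4K:    {1000 / t_4k:.0f} fps")     # GPU needs ~16.6 ms: GPU bound
```

With these made-up numbers, 1080p lands CPU bound at 200 fps while 4K lands GPU bound around 60 fps, with the CPU idle for most of each 4K frame.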

Doesn't explain why Xbox and PlayStation are using 8 cores.
The reason is simple: those ancient Jaguar CPUs are dead cheap, way cheaper than an Intel CPU with higher IPC. So the obvious choice was to optimize the games for 8 cores and save a few $$.

Btw, the refresh rate on consoles is very low, which decreases the CPU load heavily compared to those >100 FPS runs in your video.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,286
3,882
That's actually the point. At 4K the CPU load would be way lower, because almost any system will be GPU bound. ...

However, at low resolutions such as 1080p the GPU load decreases, which allows for higher framerates, moving the bottleneck to the CPU.

The video also seemed to indicate that these were DirectX 11 games. That was the first iteration where Microsoft started enabling a more multithreaded API (there were more differences to measure, so more cores became more of a 'hot'/'trendy' thing). DirectX 12 does some reassignments that further uncork the CPU driver bottleneck and enable more parallelism, if the game's infrastructure can expose it.


Reason is simple: Those ancient Jaguar CPUs are dead cheap, way cheaper than an Intel CPU with higher IPC. So the obvious choice was to optimize the games for 8 cores to save a few $$.

Not only cheap, but they also get the same DirectX 12 / Mantle / Vulkan / Metal shifted abstraction for the driver's CPU component, which cuts down the overhead.

Where the GPU pipeline filling gets more efficient, that opens the door for these 'weaker' cores to do more physics, other computation, and system/OS admin overhead on the 'extra' cores, while keeping latencies more predictable.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,286
3,882
The old (probably dated) slide deck from Tweakers places Tiger Lake in 2021. Intel places it in 2020.

https://www.tomshardware.com/news/intel-tiger-lake-10nm-2020,39299.html

There is some power-consumption drift upward here (probably a contributing reason that 14nm will be around for a while, into 2021).

That doesn't necessarily 'bring in' the next Xeon W or SP timeline by several quarters, but it doesn't necessarily mean a slide past 2020 either. [And another indication that the leaked slide deck was dated.] Perhaps not a total, across-the-board replacement for the whole W lineup, but there's a decent chance Intel will have some cherry-picked 10nm updates/replacements in the 2020 time frame. Whether Apple grabs any of those is probably more muddled than what Intel is doing. (Apple is still wandering in the dark in 2019.)
 

Ph.D.

macrumors 6502a
Jul 8, 2014
553
479
I have a 9900k running with a one-click overclock to 5GHz on all cores, using a simple air-cooled setup (Noctua NH-U14S).

Running the absolute most aggressive "power virus" program does result in maxing out the CPU's temperature (100C; don't do this for long). However, with literally anything else that does real-world work, e.g. all 16 virtual cores running flat out doing AI training for hours, the system runs remarkably cool.

Those who claim that these CPUs "have problems" with heat are often fetishizing abusive benchmarking and unrealistically projecting that onto real-world results. The actual real-world result, in my experience, is cool, composed, superbly high performance with no temperature issues or instability, even with the 5GHz overclock.
 

Slash-2CPU

macrumors 6502
Dec 14, 2016
404
267
That is the Intel Client Commercial Roadmap for SIPP!

The reason it looks odd is that you're looking at the wrong sheet. That's also why Intel has it marked confidential: because people unfamiliar with it will misunderstand it.


It's not the Desktop Client Roadmap. Huge difference. SIPP is for long-lifecycle products. Think library PCs, point-of-sale kiosks, or 500+ unit deployments where an institution wants to be able to purchase 50 more identical systems next year. This also covers military systems, process control, industrial PCs, and the like, where the computer is part of a much more complex system sold to the end user that happens to contain that computer. It is a much, much slower product cycle that leaves the majority of Intel's CPU SKUs off the order sheet and sometimes even skips refreshes.


Go look up the latest desktop client roadmap. It looks completely different, as it should. SIPP clients aren't as concerned about nm or core count or raw performance. They need a Stable Image Platform Program, so they can build and maintain a full-scale deployment or a full solutions kit that happens to include an x86-64 system.

In other news, school buses receive less frequent technological updates than Teslas.
 