The latest Intel CPUs are by definition the cutting-edge CPUs. Intel releases dozens of CPUs every year (compared to Apple's two). Apple waits quarters (and sometimes years) to pick up the latest ones. The reasons are different but always lame: a motherboard redesign would be required, a case redesign would be required. So? Redesign it. Apple has its own ideas. They stick with the same case design for years and wait until a "proper" CPU shows up in Intel's lineup.
And yes, the best GPUs are NVIDIA GPUs right now. For some reason, Apple prefers the inferior GPUs from AMD.
Intel is cutting edge? Really? Seriously? It carries the cruft of x86 compatibility like an albatross around its neck. Its greatest asset is also its greatest burden. It is also the stick Intel beats Microsoft with to keep Windows from moving to anything that would strangle Intel's profits. How I wish Microsoft had some courage right about now.
As for real cutting edge, Apple is shipping a 7nm SoC today, not promising 10nm for holiday 2019, maybe. AMD just announced its 7nm node today, along with Zen 2, EPYC Rome with PCIe 4.0, and 7nm Vega Instinct GPUs, while Intel continues to struggle with 10nm two years after it was promised, with every indication that it may never see the light of day, the Core i3-8121U notwithstanding.
The same Intel that panicked after the debut and success of AMD's Ryzen and was forced to move up its timetable for adding cores to its consumer CPUs, because the performance gains from new microarchitectures are now practically nonexistent.
The same Intel that had to be browbeaten by Steve Jobs to really start considering performance per watt, which led to a mini-Renaissance for Intel and benefited all personal computer users.
The same Intel that made its own GPU engineers lobby for space on the die in order to give us the Iris Pro Graphics 5200, an integrated GPU that could hold its own against the GeForce GT 650M, a discrete GPU - https://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested - and then squandered that talent while it chased NVIDIA, trying to keep it away from its juicy enterprise customers.
The same Intel that, with a seemingly endless portfolio of Xeon CPUs, seems to be losing mindshare to AMD's EPYC.
The same Intel that is too stubborn to integrate an LPDDR4/X memory controller into its 14nm CPUs (at least the mobile parts), leaving PC OEMs to use DDR4 and Apple twisting in the wind, so much so that Apple had to relent and allow DDR4 on the 15" MacBook Pro motherboard to appease customers while Intel blew through its 10nm promises.
None of that sounds cutting edge to me, but I digress.
As for GPUs, you can have your NVIDIA GPUs... I dislike NVIDIA as a company, and although I empathize with others who must use its cards day to day because of CUDA, that is as far as it goes for me. The 20x0-series offers a modest increase (~20%) in performance over the 10x0-series GPUs, and that is great, but until developers embrace ray tracing, it is all still a work in progress, and leadership is subject to change hands on any given day.