Good luck with battery life with a chip that draws 18 amps!

This is why I don't get why Intel has acted for over a decade like they hate the desktop. It's one of the few areas they were good at, and that few others were competing in, and they basically just quit caring.

Quick edit: ha I missed the joke there too but still.
 
I was originally not a fan of moving to Intel from the G5. I had really high hopes for the future with Apple and IBM. But the transition was better than I anticipated, and I was pretty happy with my Xeon Mac Pro and 16" MacBook Pro, until they became heat boxes and helicopters (fans). At that point I almost wished they would have switched to, or ADDED, AMD to the mix. But that was before any of us really knew about their plans to move to the M chips.

IF Intel can make a good processor again, something that's not a heat box AKA helicopter, I can only hope they do the same for the Xeon line and continue to improve going forward. Offering another GPU option is definitely needed, so having a really good process node would be a boon for them. But they can't fail again; they have zero margin for error on this. I wish them luck, and hope that Apple uses them for chip production in the near future.
 
Does anyone know if these are still using the x86 instruction set ?
I would think so with the “core” product name.
But wouldn’t that put them behind the start line in performance ?
 
It's all about yields and performance at this point. It took a long time for Intel to build out the new fab in Chandler, at the opposite end of the Phoenix metro from the TSMC plant in Anthem. I hope they succeed even if they don't put out many Apple processors.

See my above post and link. 18A is already a failure. Only Intel is using it. Chip designers rejected it, including Nvidia and reportedly Apple.
 
Does anyone know if these are still using the x86 instruction set ?
I would think so with the “core” product name.
But wouldn’t that put them behind the start line in performance ?
Yes. Intel and AMD are still on x86. In fact, Intel recently proposed a clean sheet redesign for x86 and the industry rejected it.

Now Intel has to sit on a committee made up of the hyperscale giants like HP and Google along with its competitor AMD to hash out the future of x86. Pretty sad that the inventor of x86 no longer calls the shots. That’s how far Intel has fallen.

 
Regular consumers probably don't know or care, but among PC gamers, they've burnt away most of their goodwill: too-frequent socket changes, the 13th and 14th gen design flaws, the CPUs requiring obscene levels of power. AMD is king at the moment with their X3D series; AMD kept the same CPU socket relevant for a decade, and the socket after that looks to be doing the same.
Yeah, gamers need to dedicate all available power and cooling in their machines to inefficient Nvidia cards.
 
Please, please, please Apple make a pro system that uses intel! My software requires intel. I use 3d studio max and vray (amongst other plugins) and they run like a dream on parallels on my 27” iMac - such a cool design too!
Or you can prod Autodesk for a Windows-on-Arm version of 3ds Max, which they've been dragging their feet on for far too long. As for V-Ray, it supports Apple Silicon just fine. You can use e.g. V-Ray for C4D on macOS, in both CPU and Metal RT modes (since v7). https://support.chaos.com/hc/en-us/articles/4412218197265-Can-I-use-V-Ray-GPU-on-macOS

It's Autodesk's problem, not Apple's.
 
Is that 27 hours of standby battery life?

They have been promising all day battery life for more than a decade. Every single PC laptop I have owned somehow managed to last 2 to 3 hours doing absolutely nothing. Granted, they're H series chips, so they're quite power hungry, but the U series chips are a complete joke for dev work.

Show me something that competes with the M1 Pro in real world performance per watt.
 
While I agree that ARM is more power efficient, let's not forget the OS. Intel and AMD's real problem is the power inefficiency of Windows, and ESPECIALLY the Teams app that most Windows users use. An hour's use of Teams, combined with it running in the background, will more than halve your battery life.
 
First it was Matt Damon and Ben Affleck together in a new movie, then we found out we'll get a new Avengers movie with mostly the same actors, then Solskjäer was named the coach of ManU again, and now Intel chips might be going into future Apple products again

maybe it's just me, but sure feels like the theme of 2026 is "old is new again" and we're fresh out of, well, fresh ideas
 
Or you can prod Autodesk for a Windows-on-Arm version of 3ds Max, which they've been dragging their feet on for far too long. As for V-Ray, it supports Apple Silicon just fine. You can use e.g. V-Ray for C4D on macOS, in both CPU and Metal RT modes (since v7). https://support.chaos.com/hc/en-us/articles/4412218197265-Can-I-use-V-Ray-GPU-on-macOS

It's Autodesk's problem, not Apple's.
You’re right. Autodesk wouldn’t be as obliging as Apple; Apple has a history of making great (Intel) machines. I’ll never go back to PC, I use many apps that run better on macOS (Adobe, Fusion, Final Draft etc.).
 
The chip doesn’t draw 18 amps; it’s an 18 angstrom (18A) process, which is 1.8 nm. It has to do with circuit size, not power draw.

Well.... it might (actually, 100% WILL) draw more than 18 amps if it is in line with recent intel chips when boosting....

18 A at 1.1 V (around where most CPU core voltages live) is not much power. Intel have recently been boosting beyond 250 watts, which at 1.1 V is a lot of amps!

from chatgpt...

---

Use the basic power equation:

P = V × I

So:

I = P / V = 250 W / 1.1 V ≈ 227.3 A

Bottom line: to draw 250 W at 1.1 V, you’re looking at ~227 amps. That’s an enormous current, which is why:

  • At these voltages you must use very short, very wide conductors (or planes)
  • Even tiny resistance causes big losses and heat
  • This is squarely in CPU/GPU VRM territory, not something you run over wires casually

If you want, I can also break down real-world current including VRM efficiency (e.g. 90–95%), which pushes this even higher.

--- back to me ---


An 18 A power draw from Intel would be a two-orders-of-magnitude improvement!! Some of their stuff is set up for 300 to 400 amps!
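The arithmetic above is easy to sanity-check. A minimal sketch, where the 250 W package power, 1.1 V core voltage, and 90% VRM efficiency are the illustrative figures used in this thread, not measurements of any specific chip:

```python
# Sanity-check the "18A is a process node, not a current rating" point
# and the P = V * I arithmetic above. All figures are illustrative.

ANGSTROMS_PER_NM = 10

def rail_current(power_w: float, voltage_v: float) -> float:
    """Current in amps on a rail delivering power_w at voltage_v (I = P / V)."""
    return power_w / voltage_v

# "18A" the process: 18 angstroms expressed in nanometres.
print(18 / ANGSTROMS_PER_NM)                      # 1.8 (nm)

# Core-rail current for a 250 W boost at ~1.1 V:
print(round(rail_current(250, 1.1), 1))           # 227.3 (A)

# Current pulled from the 12 V EPS input at ~90% VRM efficiency
# (the VRM must draw 250 / 0.9 ≈ 278 W to deliver 250 W to the core):
print(round(rail_current(250 / 0.90, 12.0), 1))   # 23.1 (A)
```

Note the asymmetry: the huge numbers live on the low-voltage core rail after the VRM, which is why those hundreds of amps never travel through the power supply cables.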
 
I'm very glad they've brought back the XPS line. Their lineup was so confusing this past year! We replaced several laptops at the office and no one got a Dell this year.

I looked at the 14" and 16" models today, and then I read about the 13" making a comeback. They look like nice machines, albeit pricey! It'll be interesting to see what they do with the XPS 13.
Agreed. With Dell prices skyrocketing like they think they're Apple, the worst re-brand in history, and absolutely terrible customer service, I've moved on to recommending Lenovo. While the XPS 13 looks like it checks all the boxes, let's wait and see what Lenovo and HP do with Ultra Series 3 and whether they can offer the same specs at lower prices.
 
Agreed. With Dell prices skyrocketing like they think they're Apple, the worst re-brand in history, and absolutely terrible customer service, I've moved on to recommending Lenovo. While the XPS 13 looks like it checks all the boxes, let's wait and see what Lenovo and HP do with Ultra Series 3 and whether they can offer the same specs at lower prices.
We've got a few more machines to replace for people this year. Thanks for putting some other options on my radar! (We can't use Macs at the office, alas.)
 
Yes. Intel and AMD are still on x86. In fact, Intel recently proposed a clean sheet redesign for x86 and the industry rejected it.

Now Intel has to sit on a committee made up of the hyperscale giants like HP and Google along with its competitor AMD to hash out the future of x86. Pretty sad that the inventor of x86 no longer calls the shots. That’s how far Intel has fallen.


AMD extended x86 to 64-bit (x86-64) first. (Intel had some internal 64-bit alternatives but moved slower. And no, this wasn't Itanium; that effort was meant to kill off 64-bit RISC server chips, not be a commodity PC chip path. It largely did: only Power and a small amount of SPARC are left.)

Intel hasn't 'fallen' as much as there is more competition now. Intel can't use x86 'improvements' as a tool to take market share away from AMD. One of the x86 ecosystem's problems is how new stuff has been thrown in (AVX-512 in Intel server options but not in lower end options). Another is that x86 is constipated with layers of additions meant at least as much for gap-forming as for end-user enhancements (3 or 4 different SIMD instruction solutions). It is a double-edged sword: selling 'maximum backwards compatibility' while trying to keep up with the bleeding edge holding a boat anchor.

x86 never was inherently better than the other options. Intel wasn't entitled to being supreme dictator in the first place; taking input from their customers is where they always should have been. Similarly, freezing out AMD always did fragment the x86 space.

I don't think Intel did a 'clean sheet' proposal. It was more about retiring (dumping) some of the ancient legacy stuff (which is mostly boot/security baggage) and undoing some of the fragmentation they had layered in; more de-bloating x86 than starting over. What was rejected was doing this without closely coordinating with AMD. Clean-up, yes, but not primarily as a tactic for saving market share in x86; it should be done to save the x86 platform.

Intel invented x86, but they also licensed x86. The latter means they have platform responsibilities.
 

Apple doesn't use Intel chips anymore, so the new Core Ultra Series 3 processors will be exclusive to PCs, but there are rumors that Intel could manufacture some Apple chips in the future. According to Apple analyst Ming-Chi Kuo, Intel will make lower-end M-series chips for Apple's Macs built on the 18A process, using Apple chip designs. Intel could begin shipping chips to Apple as soon as mid-2027.

Sigh. Even though previous threads pointed out this is incorrect, they keep coming back with the same flawed reporting.
Apple doing 18A in 2027 makes no sense. Indeed, go to Kuo's actual reporting:

"... Apple previously signed an NDA with Intel and obtained the advanced-node 18AP PDK 0.9.1GA. ..."

Apple would be using 18AP, not 18A. If Apple got the 18AP design kit, then that is far more likely what they would be using.
 
Nvidia tested 18A and isn't moving forward with using it.

Maybe it's good enough for Apple though?

Apple isn't using 18A. This is mainly sloppy reporting by MacRumors. Kuo's report was that it was 18AP that Apple got a design kit for, not 18A.

Nvidia is working with Intel on multiple chiplet solutions. If not working with 18AP, there is a very high probability that they are working with Intel on EMIB/Foveros solutions through Intel fabs. Broadcom was evaluating 18A. Nvidia has a substantially large networking business also; the same reasons Broadcom was doing an eval could apply to Nvidia. Nvidia can't both be a very deep SoC partner with Intel and also 100% ignore all of Intel's fab technology.
 
Please, please, please Apple make a pro system that uses intel! My software requires intel.

That runs what? Apple has explicitly announced that macOS on Intel is dead as far as new releases go.


Future Macs need new operating system updates during the time they are released. macOS 26 Tahoe has Intel support, but at this point it is announced as a 'dead end'. Maybe 3 more years coming of just security updates and some dead-end bug fixes. Who is going to buy a $6K machine with an end date like that on it? Some 'lunatic fringe' perhaps, but in substantive volume? No.

There are no drivers for current-era GPUs to go with it either. 2-3 year old GPUs? Is that going to promote widespread sales? Nope.

The T2/T-series chips are basically dead also. Apple's UEFI implementation... dead.

The Apple macOS library stack is largely being retracted here also (Rosetta is being pulled back over the next 3 years too).

Very good chance the kernel extension API pragmatically dies when macOS on Intel dies (at least for 3rd parties).

There is a large software stack that is being dropped. Parallels makes macOS calls for certain critical functionality. If that substrate is gone, then you have real problems.

No supported software stack pragmatically means what you have is a hackintosh. If you want a hackintosh just make one with the dead end Tahoe. Apple isn't going to waste any more time directly filling the hackintosh market space now than they did during the Intel era.


I use 3d studio max and vray (amongst other plugins) and they run like a dream on parallels on my 27” iMac - such a cool design too!

3ds Max is not a macOS application. Apple is going to do double somersault backflips to enable a non-macOS app to run? Apple is 99.9% focused on running Macs (and macOS). That is it.
 
  • Like
  • Love
Reactions: _Spook_ and Wx_Man
Lol, guessing they're going to be the cellular modem chips. It's about all I'd trust Intel with just yet.
The cellular chips are too important to trust to a company that can’t hit power and efficiency targets. It’ll probably be some unimportant USB support chips or something like that.
 