Who the **** cares about power efficiency in the High Performance Computing sector?

These people for a start: https://www.engadget.com/2018/06/18/astra-arm-supercomputer/

Plus anybody with enough high-school physics to understand that power consumption = heat output, and even if you don't give a fig about the electricity bill, keeping electronics cool is a major design problem and a big constraint on how much power you can fit into a given space.
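A quick back-of-the-envelope in C makes the point; the wattages here are made-up, illustrative numbers, not specs from any product:

```c
#include <stdio.h>

/* Back-of-the-envelope: every watt a CPU draws ends up as heat the
   room has to remove. Example numbers are illustrative only. */
int main(void) {
    double cpu_watts   = 250.0;  /* one high-end server CPU */
    double cpus_per_1u = 2.0;    /* dual-socket 1U server */
    double units       = 42.0;   /* a full 42U rack */

    double rack_heat_w = cpu_watts * cpus_per_1u * units;
    printf("CPU heat per rack: %.0f W (%.1f kW)\n",
           rack_heat_w, rack_heat_w / 1000.0);
    /* All of that must be extracted by the cooling system, which
       itself draws power -- hence datacenter PUE figures above 1.0. */
    return 0;
}
```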
 
Very interesting news. This would be most excellent for the 13” MacBook Pro if it includes a nice GPU in the AMD SoC, well, more powerful than Intel's anyway.
 
These people for a start: https://www.engadget.com/2018/06/18/astra-arm-supercomputer/

Plus anybody with enough high-school physics to understand that power consumption = heat output, and even if you don't give a fig about the electricity bill, keeping electronics cool is a major design problem and a big constraint on how much power you can fit into a given space.

And like all performance metrics, there is a sweet spot between performance and energy efficiency. Those dollars per watt matter even at the supercomputer scale.
 
So they had to do this weird methodology to pitch it against a Xeon. Totally believable.

Because even that Xeon has Hyper-Threading, which will still make the Xeon faster.

Dude, stop this circlejerk about ARM. 32 cores at a 105 W TDP. That tells you everything you should know about this idea. And no, a 32-core CPU will not be faster than Intel's 250 W TDP CPUs, for two reasons: core clocks and Hyper-Threading. That power budget will always pay dividends.

No matter how much you believe in something, that will not change the fact that it is not true, especially in the real world.

We are talking about a CPU, not a space heater.
32 cores, where each core is faster and uses half the power, is a fact.

A Pentium 4 using 100 W is not faster than a 15 W Core i7.

Oh, and AMD is much more power efficient than Intel; that's pretty obvious. If a 200 W AMD chip beating a 300 W Intel chip is a fact you can believe, why is a 100 W ARM chip beating Intel so unbelievable to you?

And SMT isn't extra computational performance.
It is an optimization method to fully utilize a single core's resources when the superscalar scheduler cannot do that automatically.

That's a useful feature when your CPU pipeline has too many "bubbles". When ARM cores start getting the same bubbles, they will implement SMT too.

That they don't need it right now doesn't give them any disadvantage against Intel.
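For what it's worth, here is a toy C sketch of what those "bubbles" look like from the software side: a single serial dependency chain leaves a wide superscalar core partly idle, while independent chains can overlap and fill the gaps — the same idle issue slots SMT fills with a second hardware thread. It's an illustration, not a benchmark of any particular CPU:

```c
#include <stdio.h>

#define N 1024

/* Each floating-point add has multi-cycle latency. With one
   accumulator, every add waits on the previous one, leaving issue
   slots empty ("bubbles"). Two independent chains let the core
   overlap them -- SMT attacks the same idle slots with a second
   hardware thread instead. */

double sum_one_chain(const double *v, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += v[i];              /* serial dependency: s -> s -> s */
    return s;
}

double sum_two_chains(const double *v, int n) {
    double s0 = 0.0, s1 = 0.0;
    for (int i = 0; i + 1 < n; i += 2) {
        s0 += v[i];             /* chain 0 */
        s1 += v[i + 1];         /* chain 1, independent of chain 0 */
    }
    return s0 + s1;
}

int main(void) {
    double v[N];
    for (int i = 0; i < N; i++) v[i] = 1.0;
    printf("%f %f\n", sum_one_chain(v, N), sum_two_chains(v, N));
    return 0;
}
```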
 
I can only imagine how much the Intel people are freaking out right now.

They already know.
Currently, even if the server market wanted to completely transition to AMD, it wouldn't be possible in practice due to TSMC's production capacity limits, even though they are increasing capacity as fast as they can.

But AMD could probably handle something like supplying the "small amount" of SKUs that would be needed for Macs...

TSMC and Samsung.
What if the next Apple processors include several ARM cores and some AMD cores?

They won't, other than the T2 processor.
 
No, they don't. They only have them minimally beat for games at 1080p. At 2K and higher, the difference is almost non-existent. But yes, they have them beat on single core, though not by much. A lot of things are moving to multi-threaded now that AMD finally pulled their heads from their arses.
You said "no they don't" and then "yes they do," then explained that they beat them but not by much. So which is it?
 
You guys aren't correct about Thunderbolt 4 (anymore).

Since Intel is such a pinnacle of innovation right now they "just" change numbers in the names of products/technologies to suggest something more powerful is coming your way.

Thunderbolt 4 is no longer using PCIe 4.0 x4; it's just Thunderbolt 3, meaning PCIe 3.0 x4 and DisplayPort 1.4, but with a more powerful USB controller that handles 20 Gb/s USB devices instead of 10 Gb/s (as it is right now).

This means you won't see any performance increase with Thunderbolt 4 if you are using an actual Thunderbolt device, only if you use a Thunderbolt 4 computer with a 20 Gb/s USB device. But you can just get a dedicated 20 Gb/s USB PCIe add-in card to achieve this performance with a computer that already has Thunderbolt 3.
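The arithmetic behind those numbers, for anyone who wants to check (a quick sanity check using the standard published link rates, not anything Intel has confirmed about TB4):

```c
#include <stdio.h>

/* Sanity-check the bandwidth numbers in this thread.
   PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding. */
int main(void) {
    double pcie3_lane_gbps = 8.0 * 128.0 / 130.0;   /* ~7.88 Gb/s usable */
    double pcie3_x4_gbps   = 4.0 * pcie3_lane_gbps; /* ~31.5 Gb/s */

    printf("PCIe 3.0 x4 payload bandwidth: %.1f Gb/s\n", pcie3_x4_gbps);
    printf("Thunderbolt 3 link rate:       40.0 Gb/s\n");
    printf("USB 3.1 Gen 2:                 10.0 Gb/s\n");
    printf("USB 3.2 Gen 2x2:               20.0 Gb/s\n");
    return 0;
}
```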

It's pretty sad.

I hope Apple is actually switching to AMD CPUs and Ryzen 4000 APUs, and isn't just using internal macOS builds without the digital locks that prevent macOS from running out of the box on AMD hardware, simply to increase pressure on Intel to push down the prices of Intel CPUs and chipsets ("Look how much faster macOS runs on Zen 2-based AMD hardware; why should we pay you amount x for your over-priced and out-dated stuff, where we have to develop new security fixes each month that further reduce performance?").

Intel's only current solution is to (indirectly) dump cash on its OEM customers.

There is no Thunderbolt 4. It's Thunderbolt 3.
 
Apple should have bought AMD a couple of years ago when they were at $2 per share. That would have given them even more control over the Mac's graphics and now (possibly) processors.
I'm pretty sure there is a clause in Intel's x86 licensing deal that says if AMD is sold, the license is voided. It essentially prevents AMD from ever being bought by anyone.
 
Yeah, and Hackintoshes have been running on AMD CPUs for the past 3 years because the Hackintosh community redesigned the code themselves, eh?

No. AMD and Intel use the same ISA. There is no redesign; you just switch CPU vendors, and the instructions are the same.

The reason there is AMD code in the Apple Catalina beta is that Apple is readying AMD CPU-based Macs and needs software to be able to test them. There is no better way to test the stability of the OS, and software, than the cheap, old APUs: Picasso, Raven Ridge, etc. AMD-based Macs will most likely use Renoir, Van Gogh, and Zen 2-based Ryzen CPUs.
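A small illustration of the "same ISA" point, assuming an x86 machine and GCC/Clang's cpuid.h: the same binary runs unmodified on both vendors, and about the only difference it can even observe is the CPUID vendor string:

```c
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang helper for the x86 CPUID instruction */

/* The same x86-64 binary runs unmodified on Intel and AMD; the only
   "vendor" difference software normally sees is the CPUID ID string. */
int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* Vendor string is packed into EBX, EDX, ECX in that order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    /* Prints "GenuineIntel" or "AuthenticAMD" -- same ISA either way. */
    printf("CPU vendor: %s\n", vendor);
    return 0;
}
```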
They can run the OS. Can they do it up to Apple standards?
 
Erm... Google "ARM supercomputer"... you are wrong.

There is one on the TOP500 supercomputing list:


Astra - Apollo 70, Marvell ThunderX2 ARM CN9975-2000 28C 2 GHz, 4x EDR InfiniBand
Rank: #198
Site: Sandia National Laboratories
Manufacturer: HPE
Cores: 143,640
Memory: 331,776 GB
Processor: Marvell ThunderX2 CN9975-2000 28C 2 GHz
Interconnect: 4x EDR InfiniBand
Linpack Performance (Rmax): 1,833 TFlop/s
Theoretical Peak (Rpeak): 2,298.24 TFlop/s
Nmax: 5,220,096
HPCG: 90.9 TFlop/s
Power: 1,192.54 kW (Submitted)
Power Measurement Level: 3
Operating System: Tri-Lab Operating System
Compiler: armclang 19.1.0
Math Library: OpenBLAS 0.2.20+
MPI: HPE MPI 2.19


On average it requires about 3 times the number of processors to get comparable performance.
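For reference, some napkin math you can do straight from the TOP500 entry quoted above (just arithmetic on the submitted numbers, nothing more):

```c
#include <stdio.h>

/* Napkin math from the TOP500 entry quoted above. Note that
   TFlop/s per kW is numerically the same as GFlop/s per W. */
int main(void) {
    double rmax_tflops = 1833.0;    /* Linpack Rmax */
    double power_kw    = 1192.54;   /* submitted power */
    double cores       = 143640.0;

    printf("Efficiency: %.2f GFlop/s per watt\n", rmax_tflops / power_kw);
    printf("Per core:   %.2f GFlop/s\n", rmax_tflops * 1000.0 / cores);
    return 0;
}
```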
@mdriftmeyer

Intel teased/announced Thunderbolt 4 with a misleading presentation at CES 2020 back in January.


Also, only TSMC currently manufactures the Zen 2 7 nm chiplets that are the basis of every AMD CPU that kicks Intel's butt.

That's because AMD signed an exclusive contract with TSMC. Teasing TB4 means it is 3 to 5 years from market. They're teasing it because by then the transition to PCIe 5.0 will have occurred. It's a work in progress.

Courtesy of PCWorld
Source:

So what exactly is Thunderbolt 4? Intel’s still not saying an awful lot.

Here’s what Intel told Tom’s Hardware when asked:

“Thunderbolt 4 continues Intel leadership in providing exceptional performance, ease of use and quality for USB-C connector-based products,” it said. “It standardizes PC platform requirements and adds the latest Thunderbolt innovations. Thunderbolt 4 is based on open standards and is backwards compatible with Thunderbolt 3. We will have more details to share about Thunderbolt 4 at a later date.”


Then the company further clarified that it was referring to USB 3.1 when it was referring to “USB 3”. USB 3.1 transfers data at 10 Gbps. Thunderbolt 3 transfers data at 40 Gbps. Four times faster. Simple, right?

Not really. And when PCWorld asked for clarification, Intel merely replied that “more details on Thunderbolt 4 will come at a later date.”

The bottom line? Who knows. Intel hasn’t done itself any favors here, and right now there’s not a lot to go on where either Tiger Lake or Thunderbolt 4 is concerned.


By the way, Tiger Lake will most certainly be delayed. Their 10 nm process is not happening for mass-production consumer products until late 2021. Intel marketing claims otherwise, but internally it is so.
 
Don't underestimate the marketing effect on the public a (perceived) halo hardware manufacturer like Apple has.
Again, Apple is a marginal player in the computer industry. It's a consumer electronics company now (and has been for quite a while). There is nothing "halo" about Apple in computers.
 
Again, Apple is a marginal player in the computer industry. It's a consumer electronics company now (and has been for quite a while). There is nothing "halo" about Apple in computers.

That's correct, and that's why I wrote "(perceived)": thanks to Apple's marketing, (many) people think that Apple represents top-notch personal computing devices. Their laptops, for better or worse, get copied by almost all other manufacturers, and the mobile sector is a large piece of the pie for Intel. AMD is set to expand in this sector in 2020 with the Ryzen 4000 APUs that curb-stomp Intel's 14++++ nm SKUs.
 
We are talking about a CPU, not a space heater.
32 cores, where each core is faster and uses half the power, is a fact.

A Pentium 4 using 100 W is not faster than a 15 W Core i7.

Oh, and AMD is much more power efficient than Intel; that's pretty obvious. If a 200 W AMD chip beating a 300 W Intel chip is a fact you can believe, why is a 100 W ARM chip beating Intel so unbelievable to you?
Clock speeds, per-core throughput, number of threads, available memory bandwidth, and the fact that Amazon used sketchy testing methodology to selectively show a performance advantage over an Intel CPU in a specific scenario, not overall average performance. Those are the red flags that are obvious to anybody with at least a little understanding of High Performance computing.

Don't buy Amazon's marketing. They have to sell their product. I haven't seen any real-world, High-Performance application that actually runs well on ARM cores. And I mean ARM cores, not dedicated hardware for a specific task that is in the SoC.

If I see ARM cores that ACTUALLY are faster than x86, believe me, I will be the first to jump ship, just like I am constantly raving on this forum about AMD, because they are way faster than Intel. I'm not stupid enough to ignore High Performance.

But that ship hasn't sailed. It appears, based on how x86 progresses, it never will.

Erm... Google "ARM supercomputer"... you are wrong.
And how fast is it compared to anything x86, CORE FOR CORE?
 
That's funny, a "gaming Mac". If that happens, it had better be Apple's first Windows PC, because macOS isn't even getting new Blizzard games anymore (Overwatch, supposedly no Diablo 4 now), and the 32-bit doomsday/Catalina killed older games.
I gave up and installed Boot Camp. Apple's drivers for Boot Camp are better than they used to be. I have a 2019 iMac with the Vega 20 GPU. The machine benchmarks about the same in Windows as it does in macOS.
 
AMD, Intel, or Snapdragon, I don't care what's in the box; I care about what I, as the end user, get out of it. No one was asking what processor the iPod was running, and no one probably knows what tech is inside the AirPods.


I am not arguing, but there are people who say that ARM chips are just as fast as desktop chips but have the advantage of longer battery life and cooler operation. Don't shoot the messenger.

Apple dumped Motorola's 68000 CPUs, and later IBM and Motorola's PowerPC, like a hot potato when they couldn't keep up with Intel. There's no reason to think Apple wouldn't do the same to Intel if they can't keep up.

Jobs... Jobs dumped Motorola, not Apple. Jobs was brave enough to seek the best user experience even if it meant reprogramming all the software. Cook looks at what makes the profit line on the balance sheet go higher.
 
HA! I knew it! I mentioned it in the past: it's more likely Apple will go with AMD CPUs than release computers with their own CPUs.
 
Who the **** cares about power efficiency in the High Performance Computing sector?

What moron cares about it? Guys, Apple has just released a desktop computer that has a 1.4 kW power supply!

The last thing High-Performance Computing cares about is power efficiency. This debate is beyond ridiculous.
Uhh, what moron doesn't?
I work in surplus and recycling of old IT equipment, and some of our biggest customers are government agencies. I've seen thousands of servers in my time in the industry, and they keep getting more efficient, with better cooling.
I guess if the world ran your way, a standard server rack would require its own small-scale coal plant behind the building.
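Some rough napkin math on why those customers care; every number here is an illustrative assumption, not data from this thread:

```c
#include <stdio.h>

/* Why datacenter buyers care: rough annual electricity cost of one
   rack. The wattage, PUE, and price per kWh are assumptions chosen
   for illustration. */
int main(void) {
    double rack_kw        = 15.0;   /* assumed IT load of one rack */
    double pue            = 1.5;    /* cooling/overhead multiplier */
    double usd_per_kwh    = 0.10;   /* assumed utility rate */
    double hours_per_year = 24.0 * 365.0;

    double annual_usd = rack_kw * pue * usd_per_kwh * hours_per_year;
    printf("Annual cost per rack: $%.0f\n", annual_usd);
    /* 15 kW * 1.5 * $0.10/kWh * 8760 h ~= $19,710 -- per rack, per year. */
    return 0;
}
```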
 
Do they still have anything to do with Macs?

I thought they fell out of favor with Apple years ago; nor does Nvidia ever mention Mac support/compatibility when it comes to their video cards sold to the general PC market.
That's why I'm asking. I need Nvidia to come back.
This is for processors, not GPUs. Apple does not seem to have any plans to switch back to Nvidia for their GPUs.
The post talks about references to AMD's APUs, which combine a CPU and GPU. Regardless, that's why I'm asking about it. Tim has said in the past that they had no plans to leave Intel.
 
A Mac mini with a Ryzen APU ...

THAT would be the ultimate entry drug to macOS.

BYOD - Bring your own display ;)

Also this would make a LOT of owners of old iMacs upgrade.

On one hand, this would be awesome.

On the other hand, this would make me regret my recent Mac Mini purchase
 
We are talking about a CPU, not a space heater.
32 cores, where each core is faster and uses half the power, is a fact.

A Pentium 4 using 100 W is not faster than a 15 W Core i7.

Oh, and AMD is much more power efficient than Intel; that's pretty obvious. If a 200 W AMD chip beating a 300 W Intel chip is a fact you can believe, why is a 100 W ARM chip beating Intel so unbelievable to you?

And SMT isn't extra computational performance.
It is an optimization method to fully utilize a single core's resources when the superscalar scheduler cannot do that automatically.

That's a useful feature when your CPU pipeline has too many "bubbles". When ARM cores start getting the same bubbles, they will implement SMT too.

That they don't need it right now doesn't give them any disadvantage against Intel.
For those believing in the ARM architecture and Amazon's claims:
This is testing of the first Graviton CPU vs. AMD and Intel, from 1.5 years ago:
Power: ARM vs. EPYC and Cascade Lake Xeons.

This is the performance of a 64-core/128-thread CPU from 2020:

When I see that ARM is faster than 2020 x86 CPUs in real-world, average applications, which it isn't, that is the moment when I will start saying that yes, Apple should switch to ARM.

I will put it simply: it is not ready for anything High-Performance. It never will be.

P.S. The test from 2019 compared ARM CPUs which had around the same performance as the Graviton CPU.

Even if Amazon is not blatantly lying about the performance of their CPUs, being 7 times faster does not make them capable of climbing the performance mountain in front of them to beat x86.

This is the end of this debate, I hope. ARM is nowhere near x86 performance. Period.
 
If we are talking 'high-end gaming', then it would need to be closer to $2k; you are not going to switch people from a gaming PC otherwise.

If the current rumor about the "gaming" Mac is true, then I think that makes this AMD rumor more likely.
Intel's fab issues aside... the $64,000 question is WHY would Apple maintain both in a shrinking PC industry? Why would they move from Intel to AMD given that the Mac sits at 8% of Apple's sales? Because Intel is not doing great, but they aren't so bad that Apple couldn't ride it out a while longer.

Apple might feel otherwise. Because Intel really sucks right now.
 