Maybe they could have two models? The backwards-compatible one and the fresh, new, clean slate from here on? That would require TONS of cooperation on the PC side, though, for things like what Apple does with Universal executables.

If Intel announced a new x86 architecture that removed all the ancient cruft and focused on keeping compatibility with what's most useful to today's applications, would it succeed?
I doubt it. I mentioned on another thread that Apple’s benefit is that they don’t have to worry about anyone offering “the same thing just slightly better” while they’re trying to offer “A WHOLE NEW PARADIGM”. Because there are other companies waiting in the wings to offer sideways solutions in competition against any new thing that Intel creates, there’s only so much innovation they’re able to do. Now, would it be possible for them to cooperate with AMD to create a new amazing thing? Sure. The question is, would AMD see it as a business opportunity they need to be a part of OR a way for Intel to maintain relevance? If AMD saw it as the second, they’d turn Intel down and just watch them flounder.
 
Just be aware that the Intel Macs would run a LOT cooler if they had the same quality of cooling design. Sure, still not as cool as the M1 Macs, but a LOT cooler than they are.
The Intel Macs would run a LOT cooler if... ahhh, if they had the same quality of processor as the M1. You know, small, powerful, efficient? The kind of processor Intel told Apple they were making, buuuut Intel kinda missed their mark by about 20-30 watts.

They might, since they are planning to expand their datacenters as Apple shifts further toward services. Where do you think they will host those services? On Intel hardware or Apple Silicon?
Intel or whatever other mass-market processor is available. Datacenters are commodity things. If there's no benefit to differentiating at a hardware level, then there's no need to.
 
Still pushing that story, huh. OK, let's get the "former CPU designer" quote, you know... to get the technical, and clearly not just yet-another-normie, analysis:
“Digital content creators” WILL do fine with Apple Silicon. That doesn’t even take any technical knowledge OR being a processor architect to realize. Most of the current tools run as well as or better under Rosetta, and the performance is only going to improve as the apps become native.

“While standing on the ground, letting go of a rock will cause it to accelerate downwards” is another “normie” analysis that’s just realllllly not worth challenging. You COULD though. You totally could. :)
 
Just keep in mind there are two factors in the Apple M1 running much cooler:
1) the M1 itself runs cooler than the Intel chips
2) the M1 Macs have very well designed cooling systems (fans, heat sinks, heat pipes, heat spreaders), whereas the Intel Macs have such horrendously bad cooling designs that it beggars belief.

Just be aware that the Intel Macs would run a LOT cooler if they had the same quality of cooling design. Sure, still not as cool as the M1 Macs, but a LOT cooler than they are.

The Retina and earlier models (mid-2015 and earlier) all ran nice and cool with good cooling systems (I run my Retina MBP on my lap in shorts all the time, it's not even beyond lukewarm to touch, and very very rarely runs up the fan, basically just like the M1 Macs). Apple gimped the models that came after that. And then fixed them again for M1.
I'm not sure the Intel heat issues are due to the Mac's internals. I have a mid 2019 15" MBP. Apple made significant improvements on the 16", which came out about 4 months later, with redesigned cooling features etc.

The reality is it all comes down to Intel's mobile chips and how fast they heat up vs the M1, as heat has been an issue in other brands of laptops too with the last few years of Intel processors.

The mid 2015 Intel processors were a completely different design, so it's not a reasonable comparison. I agree that having a larger enclosure would help though, as the 2016-2021 MBP design is pretty tight, but I bet there would still be issues.
 
I strongly disagree that Apple has much better management. Apple's current and merely temporary advantage is that they are still sitting on a larger cash pool and their main products are still selling. But for half a decade now Apple hasn't done anything particularly innovative.
That's a bold statement. There is a long list of products (while I don't love them all, I would still call some of them innovative), and even design choices, that have pushed the industry forward as a whole toward better technologies.
 
They might, since they are planning to expand their datacenters as Apple shifts further toward services. Where do you think they will host those services? On Intel hardware or Apple Silicon? I bet they will also release a super high-end processor for workstations that they will also use in their own servers. So if they are going to make new servers for themselves, why would they not release some of those servers for the mass market? I suppose they can, and they should.
Right?! The lowering of carbon footprint alone would be enormous. Going from an average power consumption of 300w per server to 10-25w would be massive. And that doesn't even take into account the lower cooling requirements.
 
Right?! The lowering of carbon footprint alone would be enormous. Going from an average power consumption of 300w per server to 10-25w would be massive. And that doesn't even take into account the lower cooling requirements.

It also means fewer and smaller buildings - one thing people forget is that you can only bring so much electrical power and cooling into a building. The total building volume goes way down if you use M1s instead of x86 chips.
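To put rough numbers on the power and carbon argument above: a minimal back-of-envelope sketch, assuming a hypothetical fleet of 10,000 servers, the 300 W and 25 W per-server figures quoted above, and an assumed grid carbon intensity of 0.4 kg CO2 per kWh.

```python
# Back-of-envelope energy and carbon comparison for a hypothetical fleet.
# Every input below is an illustrative assumption, not a measured figure.

servers = 10_000                 # hypothetical fleet size
x86_watts = 300                  # average per-server draw quoted above
arm_watts = 25                   # Apple-Silicon-class draw quoted above
hours_per_year = 24 * 365
grid_kg_co2_per_kwh = 0.4        # assumed grid carbon intensity

def fleet_kwh_per_year(watts_per_server: float) -> float:
    """Annual fleet energy in kWh for the assumed server count."""
    return servers * watts_per_server * hours_per_year / 1000

x86_kwh = fleet_kwh_per_year(x86_watts)
arm_kwh = fleet_kwh_per_year(arm_watts)
saved_tonnes_co2 = (x86_kwh - arm_kwh) * grid_kg_co2_per_kwh / 1000

print(f"x86 fleet: {x86_kwh / 1e6:.1f} GWh/yr")
print(f"ARM fleet: {arm_kwh / 1e6:.1f} GWh/yr")
print(f"CO2 avoided: ~{saved_tonnes_co2:,.0f} tonnes/yr")
```

Even with generous error bars on every input, the gap is roughly an order of magnitude before the cooling load is even counted.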
 
Intel is already screwed.

Shows you what bad management will do and how valuable good management is in business.

Apple has FANTASTIC management.

Clearly you don't know many people at either company.
Apple has been a sweatshop for years. Employees complain about burnout, and what keeps them there is the RSUs.
Intel has the same issue, but people are leaving.
Both have a culture where being different or thinking outside the box is not what they want.
Only people in "creative positions" have their opinions valued.
Diversity is horrible, and managers hire whoever looks or talks like them.

As a chip architect, I'll pass on either company.
 
The problem for Intel is that the obvious solution, spinning off its fabs, will kill it in the long run. The only advantage Intel ever had over AMD was its fabs. Intel’s designers are terrible.

I won't say the engineers and architects are terrible.
I think their design process is terrible.
 
They are in a no-win scenario. Their advantage had always been their fabs. They use weird internal processes that don’t mesh with the rest of the industry (when I’d interview people from Intel to come work at AMD, we never hired them, because we never understood what they were talking about. They used mils instead of microns. It was wild.) And their designers aren’t great. So if they can’t fix their fabs, using a third-party fab is a big problem for them. It negates whatever advantage they might have, is hard to do, and is a cultural problem.

They are definitely stuck in the Stone Age with design methodology.
 
Mr. Dell should worry about his own company. Their day is coming too.
Exactly,
Well, AMD spun off GlobalFoundries, so I’m sure Intel could spin off theirs. But we have to remember Intel is still making money, unlike AMD in its bad years, so I’d be surprised if they don’t pull something out over the next few years. Those ringing the death knell for AMD, too, are being premature; they’re ARM licensees already and I’d be shocked if they didn’t have an ARM design in the pipeline. For now though, I’m going to enjoy watching Intel take a beating.
Intel can't spin off its fabs.
They make more than processors, and those fabs are an advantage, no matter what some people think.
If you are using a commercial fab, you are at the mercy of the fab's capacity and schedule.
Intel can do hot lots, experimental wafers, etc.
People need to stop comparing process nodes based on what a company calls them.
"7nm" refers to a nominal feature size, not transistor density.
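For anyone curious how far apart the names and the numbers are, here is a small sketch comparing approximate peak logic-transistor densities as estimated by third-party analyses (e.g., WikiChip); these are ballpark figures, not vendor specs.

```python
# Approximate peak logic-transistor densities (million transistors per mm^2),
# as estimated by third-party analyses such as WikiChip. Ballpark figures only.

density_mtr_per_mm2 = {
    "Intel 14nm": 37.5,
    "Intel 10nm": 100.8,
    "TSMC N7": 91.2,
    "TSMC N5": 171.3,
}

for node, density in sorted(density_mtr_per_mm2.items(), key=lambda kv: -kv[1]):
    print(f"{node:12s} ~{density:6.1f} MTr/mm^2")

# The takeaway matching the post above: Intel's "10nm" lands in the same
# density class as TSMC's "7nm"; the marketing name is not the metric.
```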
 
Sadly Intel is hopelessly behind Apple. Look at the M1. When could we have realistically expected similar performance at the power usage of the M1 from Intel???

Compared to what?
The M1 has no external memory bus.
Its unified memory architecture for CPU and GPU operations is a bottleneck.
It doesn't support many lanes of PCIe Gen3, and it has no Gen4 support.
For all intents and purposes, the M1 is a general-purpose embedded processor with limited external peripheral support.
Call me on power consumption when it can support more than 16GB of memory via LPDDR4 or LPDDR5.
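For context on the memory-interface side of that comparison, here is a minimal sketch of peak theoretical bandwidth computed from bus width and transfer rate. The configurations (M1's on-package 128-bit LPDDR4X-4266 versus a typical dual-channel DDR4-3200 laptop) are commonly reported ones and should be treated as approximations.

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (transfer rate).
# Configurations are commonly reported ones; treat the numbers as approximate.

def peak_bw_gb_per_s(bus_bits: int, mega_transfers_per_s: int) -> float:
    """Peak theoretical bandwidth in GB/s (decimal)."""
    return bus_bits / 8 * mega_transfers_per_s / 1000

m1_on_package = peak_bw_gb_per_s(128, 4266)   # M1: 128-bit LPDDR4X-4266 on package
ddr4_laptop = peak_bw_gb_per_s(128, 3200)     # typical dual-channel DDR4-3200 laptop

print(f"M1 LPDDR4X-4266 (on package): ~{m1_on_package:.1f} GB/s")
print(f"Dual-channel DDR4-3200:       ~{ddr4_laptop:.1f} GB/s")
```

The capacity and PCIe points above are separate questions; this only puts a number on the interface the M1 does have.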
 
M1 is both deeper and wider than current Intel designs.



Lost how? Apple’s laptop chip is beating Intel’s top-end hardware. The 4 fastest computers in the world aren’t CISC. By what metric did CISC win?

And in the past tense, like “it’s over, go home, no reason to try anymore?”

The current iteration is far from a pure RISC processor.
ARM hasn't been RISC in years.
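On the "deeper and wider" point, here are the approximate front-end widths and reorder-buffer depths as estimated by third-party microarchitecture analyses (e.g., AnandTech); Apple and Intel don't publish all of these figures, so treat them as informed estimates rather than official specs.

```python
# Approximate front-end width and reorder-buffer depth, per third-party
# microarchitecture analyses (e.g., AnandTech). Informed estimates, not specs.

uarch = {
    #  name                 (decode width, ROB entries)
    "Apple M1 Firestorm": (8, 630),
    "Intel Sunny Cove":   (4, 352),
    "AMD Zen 3":          (4, 256),
}

for name, (decode_width, rob_entries) in uarch.items():
    print(f"{name:20s} decode ~{decode_width}-wide, ROB ~{rob_entries} entries")

# "Wider" = more instructions decoded per cycle; "deeper" = more instructions
# in flight. Whether the ISA is labeled RISC or CISC shows up in neither axis.
```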
 
I won't say the engineers and architects are terrible.
I think their design process is terrible.

As the guy who created AMD’s design process in the early 2000s when I co-ran EDA in Sunnyvale, it’s hard for me to see a difference there. Engineers create the process.
 
They just started switching their entire Mac lineup to their own chips, contrary to industry common wisdom and the voices of 10,000 uninformed MacRumors posters insisting that ARM is not suitable for anything other than phones.

And you feel like they “dare not experiment?”

Ok. Whatever.

ARM is suitable for plenty; people just weren't paying attention to Ampere and others.
The big issue that I see is that people want to compare the M1 against processors that have much better peripheral support and an external memory interface.
It's not an accurate comparison until Apple has more than 16GB of LPDDR memory and 24-48 lanes of PCIe.
Those differential drivers for PCIe are power hungry.
DRAM controllers and schedulers aren't free.
 
ARM is suitable for plenty; people just weren't paying attention to Ampere and others.
The big issue that I see is that people want to compare the M1 against processors that have much better peripheral support and an external memory interface.
It's not an accurate comparison until Apple has more than 16GB of LPDDR memory and 24-48 lanes of PCIe.
Those differential drivers for PCIe are power hungry.
DRAM controllers and schedulers aren't free.

Not free, but comparatively cheap. The 16” MBP and higher end 13” MBP will use a variant of M1 with those features (and probably a bigger L2).
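To make "comparatively cheap" a bit more concrete, here is a very rough power-budget sketch; the per-lane PHY, DRAM-controller, and cache figures are assumed ballpark values, not measured or published numbers.

```python
# Very rough I/O power budget behind "not free, but comparatively cheap".
# Per-lane PHY, DRAM-controller, and cache figures are assumed ballpark values.

pcie_lanes = 24
watts_per_pcie_lane = 0.1      # assumed active Gen3/Gen4 PHY power per lane
dram_ctrl_phy_watts = 1.5      # assumed external DRAM controller + PHY power
bigger_l2_watts = 0.5          # assumed extra cache power

io_overhead_watts = (pcie_lanes * watts_per_pcie_lane
                     + dram_ctrl_phy_watts
                     + bigger_l2_watts)

print(f"Assumed I/O + memory-controller overhead: ~{io_overhead_watts:.1f} W")

# A few watts is real, but it is nowhere near the 20-30 W gap mentioned
# earlier in the thread between Intel's mobile parts and the M1.
```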
 
As the guy who created AMD’s design process in the early 2000s when I co-ran EDA in Sunnyvale, it’s hard for me to see a difference there. Engineers create the process.

The engineers being hired now or in the recent past aren't the ones who created the messed-up design methodology at Intel. I separate the two because Intel's methodology does not encourage thinking or innovation. Most engineers at Intel don't even know how to run synthesis tools.
But let's face it, at Apple they also don't want the rank-and-file engineer doing much thinking. They like, for lack of a better term, "vertically integrated" engineers: they can do a couple of things really well and don't/can't do much else.
 
Then why is Boeing still in business? Stonecipher's shift of Boeing from an 'engineering firm' to a 'business' sure seems to have gone very well.

I remember so many people being furious at Commodore slashing prices right after they had bought their C64s. You don't build customer loyalty by crapping on them like that. Their OS was a mess too. I remember trying to figure out how to format a floppy, and it was not easy. The Atari OS was, or at least seemed, easier. *shrug*

But I do remember, working at a reseller of both brands, people being very upset at the Commodore price slash.
Boeing is "too big to fail". Honestly, so is Intel. People are predicting their demise, but if they really were in trouble, the government would bail them out. They're not going to let the company with America's most advanced semiconductor tech go under.
 
So the thing is, running a foundry is about volume, and you can't stop or pause a foundry; it has to keep running just to break even. And that takes decades of investment. If Apple bought a foundry, it would need to either sell its chips to the outside world to meet the volume demands or start manufacturing chips for others' designs, and Apple isn't a company that would do either. So no, they will never buy a foundry; it makes no sense.
Were you under the impression that Apple wouldn't be able to fill its foundries if it owned one?
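A rough sanity check on that question, with every input an assumption or a public ballpark estimate (die sizes, unit volumes, and a crude yield/edge factor):

```python
# Rough check on whether Apple's own silicon volume could keep a fab busy.
# Die sizes, unit volumes, and the yield/edge factor are all assumptions
# or public ballpark estimates.

import math

wafer_area_mm2 = math.pi * (300 / 2) ** 2   # 300 mm wafer
a_series_die_mm2 = 88                       # ~A14-class die size (approx.)
m1_die_mm2 = 119                            # ~M1 die size (approx.)
iphone_units_per_year = 200e6               # assumed annual A-series volume
mac_units_per_year = 20e6                   # assumed annual Apple Silicon Mac volume

def wafer_starts(units: float, die_mm2: float, yield_and_edge: float = 0.85) -> float:
    """Wafers needed per year, with a crude factor for edge loss and yield."""
    good_dies_per_wafer = wafer_area_mm2 / die_mm2 * yield_and_edge
    return units / good_dies_per_wafer

total = (wafer_starts(iphone_units_per_year, a_series_die_mm2)
         + wafer_starts(mac_units_per_year, m1_die_mm2))

print(f"~{total / 12 / 1000:.0f}k wafer starts per month")

# Large leading-edge fabs are often quoted in the tens of thousands of wafer
# starts per month, so Apple-scale volume is in the right neighborhood.
```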
 
Not free, but comparatively cheap. The 16” MBP and higher end 13” MBP will use a variant of M1 with those features (and probably a bigger L2).

For me, those will give a much better picture of the processor's potential and power consumption.
Right now the M1 is great for an entry-level machine.
I want to see a processor with 8/12 cores and 16/24 threads, 64-128GB of RAM, and 32 lanes of PCIe Gen3/4 running my ProTools/AVID setup.
As they push to higher performance, on-chip memory bandwidth will also become an issue, and a unified CPU/GPU memory model isn't optimal.
Now we need to see them replace a Xeon.

I think it's interesting.
 
The engineers being hired now or in the recent past aren't the ones who created the messed-up design methodology at Intel. I separate the two because Intel's methodology does not encourage thinking or innovation. Most engineers at Intel don't even know how to run synthesis tools.
But let's face it, at Apple they also don't want the rank-and-file engineer doing much thinking. They like, for lack of a better term, "vertically integrated" engineers: they can do a couple of things really well and don't/can't do much else.
I also do not know how to run synthesis tools :). (OK, I know *how* to run dc_shell, and I even know how to write and read Verilog, but I certainly don’t enjoy it or advocate that anyone do it.) :)
 
They felt their market dominance - especially in the datacenter space, where it reached 90% recently - was enough to allow them to coast. And their margins on those Xeons were so high that they had no compelling reason to invest significantly in something better.

Intel's real threat is their largest datacenter customers developing their own chips or going to AMD at massive scale. Amazon has released the second generation of their AWS Graviton series and Google and Twitter are now starting to deploy EPYC in addition to Intel. Neither is a threat at the moment, but when you consider how many millions of Xeons Intel sells per year to Amazon and Google alone, any significant drop in those sales would have a serious impact on Intel's revenues.
Damn, that just got me thinking of the energy, and therefore cost, savings of ARM in a huge data center. The direct energy savings coupled with the smaller cooling load etc. would seem to be a huge incentive to fast-track whatever is needed to start the changeover.
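Along the same lines as the earlier fleet sketch, here is the comparison in dollars, with an assumed electricity rate and assumed PUE values standing in for the cooling load; all inputs are illustrative assumptions.

```python
# Same fleet comparison as the earlier sketch, this time in electricity cost,
# with an assumed PUE multiplier standing in for the cooling load.
# Rate, PUE values, and per-server draws are all illustrative assumptions.

servers = 10_000
kwh_price_usd = 0.07             # assumed industrial electricity rate
hours_per_year = 24 * 365

def annual_cost_usd(watts_per_server: float, pue: float) -> float:
    """Annual electricity bill for the assumed fleet, cooling included via PUE."""
    return servers * watts_per_server / 1000 * hours_per_year * pue * kwh_price_usd

x86_cost = annual_cost_usd(300, pue=1.5)   # heavier cooling load assumed
arm_cost = annual_cost_usd(25, pue=1.2)    # lighter cooling load assumed

print(f"x86 fleet: ~${x86_cost / 1e6:.1f}M/yr")
print(f"ARM fleet: ~${arm_cost / 1e6:.1f}M/yr")
```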
 