You clearly didn't watch the vids.
No one needs to watch the vids to know that Intel has been blowing their projections for their chips for years. I mean, THE ARTICLE is about Intel missing their goals year over year. Not only have they failed to ship on time; when those chips do ship, they fall outside the thermal envelope Intel promised. The more information that comes out regarding Intel, the more it really comes down to one thing.

If Intel had shipped what they promised, the systems in the video would have been fine. Intel didn’t.
 
In very ideal, not-real-life situations ;-)
Did I mention the Apple marketing veil? They were always very, very good at that. A huge percentage of their success stems from marketing, not innovation.

So you haven’t paid attention to any of the reviews, any of the YouTube videos, any of the tweets, or any of the comments here by actual users?

Because all of them say that M1 blows away the Intel parts they replace in performance, and do so while consuming a fraction of the electricity.
 
Since when is being better than Intel, a company that according to this article is on a downward spiral, considered good? Apple could do so much more with good, innovative management.

Biggest company in the world. Uses an innovative management structure that no other large company uses. Dragged the entire mobile industry to 64-bit computing. Transparently replaced the file system on billions of devices without anyone noticing. Moved from x86 to its own chips and everyone loves the result. Successfully introduced a system to allow developers to simultaneously target mobile and desktop devices, something everyone else had failed at.

I don’t think ”innovative” means what you think it means.
 
Ok, don't believe the vids then, no worries. But at least verify it for yourself. Get hold of a late-model Intel MBA and take the back cover off. You will be blown away to see that the heatsink sitting on the CPU has no heat pipe at all connecting it to the fan (exactly like in the vid). If you don't understand the significance of that, then find someone technical that you trust to explain it to you. Then get hold of an M1 MBA, take the back cover off, and check out the massive heat spreader that connects to the CPU's heatsink.

Verifying that the Intel MBA heatsink doesn't sit flush onto the CPU is a bit more tricky, but I'm sure you can manage if you want to. Similarly tricky to verify that the M1 MBA heatsink does sit flush. And again you could find someone techy to explain the significance of that.

The no-heat-pipe vs. heat-spreader difference is the simplest of the claims you doubt that you could actually verify for yourself.

And don't get me wrong, I'm not pooping on the M1 MBA at all, it's brilliant and game changing.

What I am doing is showing that the cooling solution of the Intel MBA is woefully gimped, and that the overheating problem has more to do with Apple's design than Intel's chip. Why? My guess is as good as yours.
How about: it does the task it was designed for. The MBA was never designed for sustained load, so it didn't need that cooling solution. Anything can be improved. A good example: if you have a desktop PC with a stock cooler, you can significantly improve the cooling by adding an aftermarket cooler. It's easy for a guy who hates Apple to make some YouTube video and act like he knows more than the engineers who designed the computer. He quickly gained a following of Apple haters who reference and share his videos like they are gospel.

Which wasn't very difficult, I'd say. On the contrary... you would have to be a huge moron not to make a very successful company out of 2011 Apple.
Of course. Anyone could do it. Tim isn't anyone special. I bet you could be the CEO of Apple. In fact I think you should apply. Just walk into an Apple Store like you own it and say "I'm in charge of this now"
 
Intel should have "read the room" and should have already known this was going to happen. Anyway, I doubt Apple will even deal with them anymore, with the exception of modems for iPhones, as it should be. Intel should just die off; they are too massive and cannot pivot quickly.
 
The Intel Macs would run a LOT cooler if... ahhh, if they had the same quality of processor as the M1. You know, small, powerful, efficient? The kind of processor Intel told Apple they were making, buuuut Intel kinda missed their mark by about 20-30 watts.


Intel or whatever other mass market processor is available. Datacenters are commodity things. If there’s no benefit to differentiate at a hardware level, then there’s no need to.
I think Apple is a company with a very long-term vision, 5-10 years out. Look at Metal, for instance: they released Metal for iOS in 2014 after they got totally burnt by the Khronos Group, which quite literally destroyed OpenCL, an Apple technology that was open-sourced for standardization. And we saw what happened: CUDA took over the market because of Khronos bureaucracy. Only now are we seeing the benefits of Metal and Apple's long-term planning for the gaming and machine-learning compute push on Apple hardware. Looking long term, most compute will be on servers and streamed back to you for a small subscription. Apple would need to run Apple Silicon on the servers for that to happen. And streaming is coming; Apple knows that.
 
they are not stalling by choice; when you get rid of, make life miserable for, or no longer reward the lifeblood of your company, you will eventually cease to exist.
What is the biggest improvement in 10 years from Intel? They were stalling by choice because they became complacent: nobody caught up to them in a decade, so why bother? Let's sell a processor that costs them $25 to manufacture at a markup of $1000+. That's what Intel did; people were pissed off but had no choice. The server market was even more bloated: $6-12K for a high-end Xeon. That's the level of extreme productization Intel did; clearly nobody liked it, but they had no choice. AMD changed that with EPYC on servers, and now Apple Silicon is changing it on the desktop. Intel is screwed, and for good.
 
Deep and wide refers to the penetration into the market and not decoding capabilities or branch prediction.

Lost as in market share.

Oh... 😄 No argument there. Intel dominated just about every level of the CPU market over the past couple decades. I’d argue their powerhouse status began before then, in their earlier innovations around microprocessor design, but they grew to pretty much own the general computing market.

That status doesn’t have to last forever though. We’ve seen plenty of other dominant players fall into obscurity in the blink of an eye.

The business school case study of it all will be the fact that the one part of the market Intel didn’t dominate, cheap low power toys and gadgets, is where the seed was planted that grew to threaten the entire Intel empire. Intel dominated market share for general computing. Not for the low power gadgets that evolved into smartphones. I’d have to dig up the numbers again, but I’m pretty sure Apple ships more processors for more profit than Intel does.

That’s where the tipping point is. Intel managed to keep x86 dominant because they had the money to do it. The best designers, the best process engineers, and an industry focused on optimizing for their architecture.

Apple, it turns out, has been using their massive market share, profits and relationships from mobile devices to match Intel on each of these and since they’re starting from a better CPU architecture and are using a more modern system view they’ve reached the point where they can beat Intel at their own game. To the point that they can just about run x86 targeted code through translation faster than Intel can run it natively.

When the money spigot to Intel starts to close, they’ll struggle more and more to keep pace with the competition with ever more limited resources.

You originally phrased your statement as RISC vs. CISC saying CISC won. That seems like the wrong way to look at it. CISC as a philosophy didn’t win anything but x86 dominated the market for a while. As a design philosophy, CISC is dead. (Actually, as a design philosophy, CISC never existed— it was only coined in retrospect to contrast with RISC, which was a design philosophy). Can you point me to a single new CISC design in the past 30 years?

Apple has a big enough hammer to force a square peg into a round hole. The headphone jack, the power brick, 32-bit apps: and that's just the iPhone. They forced a number of other changes in the Mac lineup over the years. If Intel did the same thing, they would be brought up on anti-trust charges.

Intel wouldn't be brought up on anti-trust charges for dropping peripheral support... Intel picks and chooses the technologies they support directly (USB vs. FireWire?). As far as I'm aware, they were never sued for dropping support for the ISA bus, or the EISA bus, or anything else. They wouldn't get called in by the DOJ for dropping support for segmented memory.

Intel silicon is laden with legacy functionality that, if they could remove it, might let them beat everyone else at this game.

But it's not happening. Intel doesn't produce niche chips in the general-purpose computer market (i.e., the non-mobile-phone market), and they have to support what is out there.

And this is the albatross around their neck. If they keep thinking that their current "one architecture to support every possible use case" is the way to go, they'll find x86 is a niche chip that is only used by legacy applications.
 
so many intel haters
just wait until the apple arm chips don't satisfy digital content creators
you guys have a lot of hope that these m1 processors and all the apple proprietary GPU and other BS will meet your needs

I personally LOVE intel and building my own PC.
Every iMac I have ever owned has had a bad video card making it useless for graphic design or video editing.
The fact that Apple's WORST processor is quite competitive at 4K and only barely struggles at 8K speaks volumes. And Apple's WORST processor also draws the least wattage. When it gets into the Mac Pro, with its large headroom for very high-TDP processors, things will get crazy.

Also, the fact that Apple's WORST processor beats, or at least competes with, the NVIDIA GTX 1050 Ti (which I think is now #2 in the Steam hardware survey) speaks volumes too, again doing this at only ~13.5 watts. Again, imagine the GPU in Apple's Mac Pro processor.
 
I am no server expert; I have only used servers through the AWS Console. Can you explain the difference between a server in a datacenter and a server sold off the shelf as a product?

A product like Xserve was designed to support a wide range of server workloads, not just web and db serving. A productized version would also have to be reliable, support a wide range of configurations, etc. Whereas Apple, for its own use, can just make really cheap boxes that, if they fail, are automatically replaced by redundant boxes, and which only have to run a couple applications.
 
so much so that even unplugged, on battery power, the benchmarks are exactly the same (and sometimes faster, as it heats a bit less without the charging circuitry active), while other OEMs' machines would sometimes have half the scores.
I think this gets overlooked SO MUCH. I always hear people saying their laptop is SO MUCH better than Apple's, yet when unplugged it has pretty much half the performance, like you said. If you want to keep something plugged in all the time, I would look at a desktop instead.
 
No one needs to watch the vids to know that Intel has been blowing their projections for their chips for years. I mean, THE ARTICLE is about Intel missing their goals year over year. Not only have they failed to ship on time; when those chips do ship, they fall outside the thermal envelope Intel promised. The more information that comes out regarding Intel, the more it really comes down to one thing.

If Intel had shipped what they promised, the systems in the video would have been fine. Intel didn’t.
Correct. But Apple also gimped the cooling of the recent Intel laptops. Both are true. The M1 chips are awesome and game changing. But the Intel chips don't run as hot in Apple laptops as everyone thinks, they just have gimped cooling.
 
Correct. But Apple also gimped the cooling of the recent Intel laptops. Both are true. The M1 chips are awesome and game changing. But the Intel chips don't run as hot in Apple laptops as everyone thinks, they just have gimped cooling.
They did not "gimp" it; Intel missed the thermal targets it promised Apple. Unless you wanted to wait another ~2 years for Apple to redesign their entire laptops around the ACTUAL thermals that were produced? Apple did all they could with what Intel promised (lower heat, hence the kinds of laptops we received), and Intel failed to meet those promises (the chips ran much hotter than Intel stated). So what, should Apple have delayed all their products 1-2 years for a redesign?

My 2019 i9 iMac gets WAY TOO LOUD, WAY TOO FAST. It is so irritating. But I do not blame Apple for this.
 
They did not "gimp" it; Intel missed the thermal targets it promised Apple. Unless you wanted to wait another ~2 years for Apple to redesign their entire laptops around the ACTUAL thermals that were produced? Apple did all they could with what Intel promised (lower heat, hence the kinds of laptops we received), and Intel failed to meet those promises (the chips ran much hotter than Intel stated). So what, should Apple have delayed all their products 1-2 years for a redesign?

My 2019 i9 iMac gets WAY TOO LOUD, WAY TOO FAST. It is so irritating. But I do not blame Apple for this.
Source?
 
How about Intel's roadmap from a few years ago? Chips were slated for lower power but ended up with a higher TDP than expected. Intel has been over-promising and under-delivering for many years now. How about the fact that we are NOT supposed to still be on 14nm? Have you seen what has been going on for years? The jokes you see are actually true; 14nm+++++++++ is what we are on. We should be on 7nm by now, which would have much better thermals, and laptops and Macs in general would run much cooler.

Apple didn't just wake up one day and change this. Here are some sources to get you started that mention a lot of the problems with the Intel processors.


 