I’m most looking forward to possible fanless, yet powerful Macs

Same. I get triggered when my MBA starts its fighter jet fans when I'm not even running intense applications. I always have to go find out what stupid system service has lost its mind and kill it. It gets old quickly.

If I were rendering 4K or playing a modern FPS, I'd understand.
 
While I can see that helping the cause, I think it's more Intel's general lack of innovation, i.e., they can't get off 14nm technology. You can only go so far by adding cores; you need real innovation.

Exactly. You don’t scrap your entire system architecture over one screw-up. This was brought on by repeated disappointments and over-promising from Intel over many years & chip generations. This was a systemic problem.
 
Exactly. You don’t scrap your entire system architecture over one screw-up. This was brought on by repeated disappointments and over-promising from Intel over many years & chip generations. This was a systemic problem.
May have been the straw that broke the camel’s back. Bad enough Intel keeps promising chips it can’t deliver, forcing Apple to engineer machines for chips that never come and to squeeze other chips into thermal envelopes they aren’t intended for; bad enough that using the same chips as everyone else makes it much harder to differentiate in the marketplace; but now Intel isn’t even sending us chips that work right.

I could see it playing out that way.
 
CPUs have effectively reached the peak necessary to run the apps most folks need. The real innovation now needs to be on the graphics card/chip side. It doesn’t do any good to reduce the footprint and heat factor of the main CPU if you’re just going to make up for it all on the GPU.

I agree from a raw performance perspective, but the move now is likely toward efficiency. What if you could have today's power with 2-3 days of battery life? An iPad can run for roughly a day with moderate use; now double or triple the battery size and you have a laptop that is really pushing the boundaries of what people think/expect. I see this as a deeper push to make the MacBooks (Air and Pro) last much longer. When you add in a possible future with OLED/MicroLED screens, we could be looking at 4-5 days of battery life for a 16-inch MacBook Pro. One can dream :)
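A rough back-of-envelope sketch of that battery math (the ~30 Wh iPad figure and the iPad-like average draw are assumptions; 100 Wh is the 16-inch MacBook Pro's actual battery capacity):

# Back-of-envelope: how far iPad-class efficiency could stretch a MacBook battery.
# Inputs are rough assumptions, not measured values.
ipad_battery_wh = 30.0      # roughly an iPad-class battery (assumption)
ipad_runtime_days = 1.0     # ~1 day of moderate use, per the post above
mbp16_battery_wh = 100.0    # 16-inch MacBook Pro battery capacity

avg_draw_w = ipad_battery_wh / (ipad_runtime_days * 24)    # implied average draw
mbp_runtime_days = mbp16_battery_wh / (avg_draw_w * 24)    # same draw, bigger battery

print(f"Implied average draw: {avg_draw_w:.2f} W")
print(f"Hypothetical MacBook runtime: {mbp_runtime_days:.1f} days")
# ~3.3 days, before accounting for a larger, brighter display.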
AMD has been doing nothing but innovating lately, and Apple uses AMD graphics cards exclusively, so I don’t see what you mean.

Apple's use of AMD graphics is less about AMD and more about how Apple and NVIDIA can't seem to get along. I doubt anyone at Apple is happy with AMD graphics at this point; NVIDIA is two full generations ahead in the mobile space and light-years ahead in the workstation world.

If Apple and NVIDIA patch things up, or if Apple makes their own GPU (or buys Intel's new one), AMD could be in a world of hurt too.
 
Apple's use of AMD graphics is less about AMD and more about how Apple and NVIDIA can't seem to get along. I doubt anyone at Apple is happy with AMD graphics at this point; NVIDIA is two full generations ahead in the mobile space and light-years ahead in the workstation world.

If Apple and NVIDIA patch things up, or if Apple makes their own GPU (or buys Intel's new one), AMD could be in a world of hurt too.
Apple is making their own GPU. They’ve been quite open about that, and discussed it briefly during one of the WWDC sessions.
 
Throttling up the frequency might be possible, but if Geekbench is even remotely accurate they don't even need to get into that; they just need to get at least six A13 (or future A14) cores working.

The current, as-is, 2.7GHz A13 in the iPhone 11 registers single-core performance only 7% slower than the fastest single-core chip Intel (or AMD) makes, the desktop 3.7GHz Core i9-10900K. The A13 is basically equivalent in single-core performance to a desktop Intel Core i9-9900K @ 3.6GHz, and faster than any i7 and anything that AMD makes.

It's a little hard to guess what the performance hit of more cores would be, but you can get a rough idea by pretending they went multiprocessor instead of a single die. 3x A13, each of which has a 6W TDP, so 18W TDP total, would be faster than any laptop Apple currently sells and competitive with a top-of-the-line iMac. Pretending an 8-core "A13Z" were similar to 4x A13, it would have a TDP of 24W and approach the performance of a desktop 12-core i9. A hypothetical 10-core, equivalent to 5x A13, would be 30W and faster than anything other than 24+ core Xeon or Threadripper CPUs, so competitive with 125W chips.

All of which is to say Apple doesn't need to do a thing to clock speed, they just need to be able to maintain around the same performance and per-core thermal envelope when adding more than 4 cores. Even making it to 6 would be highly competitive for laptops or iMac-style desktops at a fraction of the power draw.
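For what it's worth, here is the same estimate as a tiny sketch; the 6W per-A13 TDP comes from the post above, and linear multi-core scaling is an optimistic assumption (real scaling would be sublinear), so treat the performance figures as upper bounds:

# Sketch of the "treat N cores as N separate A13s" estimate above.
# 6 W per package is the assumed A13 TDP; scaling is assumed (optimistically) linear.
a13_tdp_w = 6

scenarios = [
    ("3x A13 (laptop-class)", 3),
    ("4x A13 (hypothetical 8-core 'A13Z')", 4),
    ("5x A13 (hypothetical 10-core)", 5),
]

for name, n in scenarios:
    print(f"{name}: ~{n * a13_tdp_w} W TDP, up to ~{n}x A13 multi-core throughput")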

What would be a real turnaround on the "designed into a thermal nightmare" 2013 Mac Pro concept and style would be to come out with a 2013 Mac Pro-style machine with their own new silicon CPU in it and say, "Now THIS is what we were talking about... innovation my ***." Don't change anything except updating the Thunderbolt ports, etc.

Finally match form AND design and check off the 2013 Mac Pro concept as "perfection" and what it WAS supposed to be... :)
 
The problems with Skylake were reported by many tech sites after it was released. However, the conclusion that the problems with Skylake caused Apple to shift to ARM is unmerited. Intel's issues may have been a factor, but Apple's movement toward ARM-based Macs was happening prior to that.




At this week's WWDC, Apple confirmed its plan to switch from Intel to custom processors for its Macs over a two-year transition period. Apple said that the switch is all about platform consolidation and performance advantages, but at least one former Intel insider claims that quality control issues with Skylake chips were the reason Apple finally decided to ditch Intel.

There have been rumors suggesting Apple has an interest in Arm-based Macs for years now, but it was only on Monday that Apple confirmed the plan, saying it expects its first Mac with custom silicon to launch by the end of 2020.

Apple analyst Ming-Chi Kuo believes that a redesigned iMac due in the fourth quarter of 2020 will be one of Apple's first two Mac models with a custom Arm-based processor, with the other being a future 13-inch MacBook Pro.

Following Apple's announcement about its switch to custom silicon, Intel said it will continue supporting the Mac through its transition, but insisted that its processors are still the best option for developers.

Article Link: Former Intel Engineer Claims Buggy Skylake Chips Hastened Apple's Switch to Custom Silicon
 
This also gives Apple license to update specs LESS often. They won’t have to listen to everybody whining about Intel Xth Gen.

This may allow them to focus on features and design rather than chasing marginal gains from Intel.
 
Now, they just need to have displays that use next to no power and we'll have full-day, full-performance laptops where you forget where the outlets are in your house or at work.
 
Case in point: the MBP where two ports run at reduced speed, because the INTEL CPU did not have enough PCIe lanes.

The comments mostly blamed Apple, out of total ignorance of CPU TDP constraints, or of the fact that DRAM has to match the CPU's memory interface.

The next year Intel FINALLY stopped crippling their 15W and 28W TDP CPUs with 12 PCIe lanes and went to 16. The 45W TDP H-series and 65W-125W TDP S-series CPUs are rumored to get 20 lanes this year, and PCIe 4.0 at that, which I will believe when I see it. I'm more interested to see whether they update the DMI interface to PCIe 4.0. Intel's tagline should be: "Dragging our asses kicking and screaming since 2017!"
 
Exactly. You don’t scrap your entire system architecture over one screw-up. This was brought on by repeated disappointments and over-promising from Intel over many years & chip generations. This was a systemic problem.

Intel had a breakthrough with Ivy Bridge, and Haswell had a rough start but ended up solid. It was Broadwell that sank them, and the quick ramp-up with Skylake while trying to bury Broadwell just continued things down this road we’ve been on for the past 5 years. I will be glad to get off the rollercoaster. Even though I may slightly miss the broader compatibility and plethora of choices I had, I just won’t miss Intel themselves.
 
Intel had a breakthrough with Ivy Bridge, and Haswell had a rough start but ended up solid. It was Broadwell that sank them, and the quick ramp-up with Skylake while trying to bury Broadwell just continued things down this road we’ve been on for the past 5 years. I will be glad to get off the rollercoaster. Even though I may slightly miss the broader compatibility and plethora of choices I had, I just won’t miss Intel themselves.

In the long run this will lead to more choice.

The Wintel monopoly is dead. Windows lost its stranglehold on corporate IT once IT departments realized they had no choice but to support BYOD for mobile, so now they are much more open to other choices (even if some, like Chromebooks, are bad choices).

And now people will see that “ARM is for low-powered gadgets” is wrong, and other companies will start down the path of other processors (maybe ARM, maybe other things). We may return to the glory days of real differentiation in the marketplace, but this time with a measure of interoperability (due to the internet, the fact that MS is open to Office running on every kind of device, subscription software plans that encourage ISVs to do ports, etc.).
 
Intel had a breakthrough with Ivy Bridge, and Haswell had a rough start but ended up solid. It was Broadwell that sank them, and the quick ramp-up with Skylake while trying to bury Broadwell just continued things down this road we’ve been on for the past 5 years. I will be glad to get off the rollercoaster. Even though I may slightly miss the broader compatibility and plethora of choices I had, I just won’t miss Intel themselves.

Yep. Agree fully. It would’ve been interesting to be a fly on the wall when solutions were discussed. I’m sure switching to AMD was given intense consideration. But given the insane year-on-year performance improvements with their ARM chips, I can see why they opted to go that route.
 
Yep. Agree fully. It would’ve been interesting to be a fly on the wall when solutions were discussed. I’m sure switching to AMD was given intense consideration. But given the insane year-on-year performance improvements with their ARM chips, I can see why they opted to go that route.
I don't think they considered AMD for long. AMD has given no indication that it can stick to a roadmap year after year and also sell the entire range of chips, from low end to high end, that Apple requires.
 
The move was bound to happen, if for nothing more than cost. The R&D for the chips made in house is already paid for by the phone and tablet, so Apple figures: what does it cost to scale them up? They already pay the fab to make them; what does it cost to make a few more, bigger ones? Cost was always going to be the trump card.

The interoperability of the Intel platform made lots of sense until Intel frankly just could not deliver. When Apple took a massive black eye from pros for feeling like they had been abandoned, I knew the end was coming for Intel one day, sooner or later. I admit I always thought Apple would buy a slice of AMD and integrate some of that tech into an AMD x86, half-made-in-house processor. But this was also a foregone conclusion once you saw them working so hard to spin the code both ways, from iPad to OS X and OS X to iPad. Once that connector exists, good or bad, you can internally justify another platform move. Intel had a shot, but they shot themselves in the foot like IBM. Apple now has the chops to step up and take a crack at it. I welcome the attempt.
 
While I can see that helping the cause, I think it's more Intel's general lack of innovation, i.e., they can't get off 14nm technology. You can only go so far by adding cores; you need real innovation.

This is where I see a slight disconnect. If process-node innovation and die shrinking is what Apple wants, then they're not really switching from Intel to in-house; they're switching from Intel to TSMC. And since TSMC has no x64 designs of its own (the x64 chips it fabs are AMD's), Apple might as well use their in-house ARM designs.
 
Skylake first shipped on the 2016 15-inch Touch Bar MBP. Later on, a snippet of ARM support was found in a macOS Sierra beta.

Coincidence?
There are two issues.
First, Apple builds their hardware, writes the BIOS/firmware, and writes the OS. In PC land the BIOS is outsourced and Microsoft exclusively writes Windows... so there's plausible finger-pointing on a lot of stuff.

Second, Apple was certainly already cross-compiling macOS for their own A-series processors several years ago. That's going to bubble up a whole bunch of small bugs that an x64-only shop would just work around: change the code, make it work now, and move on. Apple was trying to get macOS to cross-compile with exact binary compatibility for Catalyst at the time... so they were tracking their own bugs even more closely than Intel's.
 
No question, but people working at Apple who are a lot smarter were already focusing on that topic and others. I really do trust that at this point they have all the major issues and hurdles addressed and resolved; otherwise they wouldn't have announced the shift. We're at a point where the engineers and executives are happy with what they have, and it's time to enter the next phase: rolling it out.

I'd go further and say that in all likelihood Apple has tested systems that are groundbreaking in terms of power alone, which made deciding on this shift easier for them. The A12X and A12Z have already proven themselves superior in some metrics against Apple's own MacBooks, and that's without the thermal or additional RAM benefits that a laptop form factor provides.

And with the improvements we've already seen with the A13, and the A14 chips likely soon to be released, Apple is going to alter the personal computer landscape over the next few years, imho.
 
From Steve Jobs' biography:

At the high-performance end, Intel is the best. They build the fastest chip, if you don’t care about power and cost. But they build just the processor on one chip, so it takes a lot of other parts. Our A4 has the processor and the graphics, mobile operating system, and memory control all in the chip. We tried to help Intel, but they don’t listen much. We’ve been telling them for years that their graphics suck.

Every quarter we schedule a meeting with me and our top three guys and Paul Otellini. At the beginning, we were doing wonderful things together. They wanted this big joint project to do chips for future iPhones. There were two reasons we didn’t go with them. One was that they are just really slow. They’re like a steamship, not very flexible. We’re used to going pretty fast. Second is that we just didn’t want to teach them everything, which they could go and sell to our competitors.

According to Otellini, it would have made sense for the iPad to use Intel chips. The problem, he said, was that Apple and Intel couldn’t agree on price. Also, they disagreed on who would control the design. It was another example of Jobs’s desire, indeed compulsion, to control every aspect of a product, from the silicon to the flesh.


TBH, if you take this account along with everything else that's happened, it really sounds like Intel has stopped innovating, period. AMD has certainly proved that, and so has Apple with its A-series. At this point, Intel might just be too big to fail.
 