I remember the switch from 32-bit processors to 64-bit processors being a big deal. Why hasn't there been a switch to 128-bit processors? Maybe not for everything, but why aren't Xeons and other similar processors moving to 128-bit?
 
Maybe Apple is secretly working on a plan to transition macOS to Apple ARM processors.

Moving to x86 was _THE_ deciding factor in my switching to the Mac camp. As a developer, being able to run Windows and Linux in Parallels for testing is monumental. If Apple moved its laptops to ARM, I'd either have to buy a supplemental Windows laptop or give up the Mac altogether. I also still play a ~15-year-old MMORPG in Windows under Parallels.
 
Let's be honest here, guys: the general population doesn't need more computing power like it used to, and we're simply seeing that reflected in both processor and GPU manufacturers. Sure, there will always be folks who do video editing or intense multi-monitor gaming who would like more powerful products, but I'd say those are less than 5% of the addressable market.
 
Actually there is a big difference: the latter has SSDs. Which brings up the real point: the main new product differentiator isn't the processor any more, it's the peripherals.

My 2009 i7 has a hand-built Fusion Drive. The only time I notice a performance difference is when I transcode with HandBrake; her Mac is substantially faster.
 
Apple's A9X core is already there. It's as fast as a Skylake chip for tablet CPUs.

Bump up the cache, clock frequency, core counts, and power budget, maybe add some HBM, and we're pretty much set for the next MacBook Pros. The only issue is adding x86-64 compatibility modes for legacy or virtualization apps.

Which is a rather huge issue, isn't it?
 
I'll still be patient and take an Intel CPU over one of Apple's A-chip offerings.

But I have a feeling Apple are going to do something crazy and jump ship.
 
There will still be new processors every year; it just means that instead of a new architecture (and socket) every other year, there will be two rounds of refinement instead of one. Call it whatever you want, slowing down or otherwise, but there are plenty of improvements to be made in memory bandwidth, power usage, integrated GPUs, etc.

What this has nothing to do with is how severely late Apple is at adopting Skylake across its notebook line, and that's not Intel's fault.

Apple will switch desktops and notebooks to A-series ARM chips instead? Yeah, let's hope they don't try to pull that sort of dumb move.
 
I remember the switch from 32-bit processors to 64-bit processors being a big deal. Why hasn't there been a switch to 128-bit processors? Maybe not for everything, but why aren't Xeons and other similar processors moving to 128-bit?

Bits don't make a machine faster. As a matter of fact, more bits require more memory and power to work with. The upgrade to 64-bit was required because 32-bit was reaching serious limitations in its ability to address memory/storage. 32-bit systems can only use about 4GB of memory without resorting to tricks; 64-bit systems can address about 16 EXABYTES! It's going to be a long time until we _need_ 128-bit systems to address more space.
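The arithmetic is easy to sanity-check yourself. A quick back-of-the-envelope sketch in Python (the `2**n` figures are theoretical architectural limits; real chips expose fewer physical address bits, but the limit is what matters for the 32- vs 64-bit jump):

```python
# Theoretical address space for an n-bit pointer: 2**n bytes.
def addressable_bytes(bits: int) -> int:
    return 2 ** bits

GiB = 2 ** 30  # gibibyte
EiB = 2 ** 60  # exbibyte

print(addressable_bytes(32) // GiB, "GiB")  # 4 GiB  -- the ~4GB 32-bit ceiling
print(addressable_bytes(64) // EiB, "EiB")  # 16 EiB -- the 64-bit ceiling
```

So going from 32 to 64 bits multiplied the addressable space by 2^32, roughly four billion times, which is why there's no pressure to double the pointer width again.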
 
Man, the update cycles for Macs are awful.
The Mac Pro and Mac mini are completely forgotten. The MBP, iMacs and MacBooks are a bit better, but the cycles are still pretty long.
Just look at the Mac Buyer's guide on MR and it is pretty depressing.

It's brutal and concerning. The Mac Pro is in need of some love, and I don't mean brand new bands like they did for the Apple Watch!
 
Man, the update cycles for Macs are awful. Just look at the Mac Buyer's Guide on MR and it is pretty depressing.
You do realize the buyer's guide is designed to show yellow and red flags half the time? If the cycles were shorter, we would only reach mid-cycle sooner. In springtime the guide is always all red.
 



In its latest 10-K annual report (PDF) filed last month, Intel confirmed the end of its long-heralded "tick-tock" strategy of delivering new microprocessors to the market. Intel originally introduced the product cadence to the world in 2006 with the launch of the "Core" microarchitecture, alternating "ticks" of shrinking chip fabrication processes with "tocks" of new architectures.

Over the past ten years, Intel has delivered new processor families on this tick-tock cycle on a nearly annual basis, from its 65 nm manufacturing node all the way up until recently. The tick-tock release cycle allowed Intel to reestablish dominance in both the consumer and enterprise CPU markets and gave OEMs such as Apple a regular cadence to rely on for annual product updates. But with chip updates stretching beyond a yearly cycle in recent generations, Apple's product launch cycles have started to be affected.

In the face of the difficulties in maintaining the tick-tock cadence, Intel has announced that the launch of Kaby Lake this year as the third member of the 14-nm family following Broadwell and Skylake will mark the official end of the tick-tock strategy. Instead, Intel will move to a new "Process-Architecture-Optimization" model for the current 14 nm node and the 10 nm node.

This development is not unexpected, as semiconductor foundries have had increasingly tough times creating smaller process nodes as fabrication of smaller transistors has become increasingly expensive and complex. Transistors are rapidly approaching the physical limits of traditional semiconductor geometries, and the famous Moore's Law regarding transistor density has been formally acknowledged to no longer be valid.

Intel has no doubt moved to this new release model in an attempt to get back to a regular product and platform cadence as it struggles with the technological challenges of bringing new fabrication nodes to volume production. As noted in our Mac Buyer's Guide, many of Apple's Macs have gone without an update for the longest time since we began tracking them, and Apple has yet to adopt the available Skylake microarchitecture across its Mac line. Some product uncertainty is likely to continue, as the launch of Intel's Kaby Lake microarchitecture was recently delayed to the second half of 2016 after Skylake suffered similar setbacks last year.

Article Link: Mac Update Cycle Faces Uncertainty as Intel Abandons Tick-Tock Strategy
 
OMG, so many amoebas talking about things they don't understand or have the slightest idea about. ARM minions are puffing about 13" Chromebooks, and INTC just whooped their butts with the Celeron line. BTW, that line was created around 2000. Can you imagine that? The ARM clowns have been in business, what, the past 25-30 years, and they have no game in servers. So AAPL should kneel and beg for mercy, otherwise INTC might let those amoebas stay in the middle ages for another 10 years.
 
Apple likes to have someone to blame when things go wrong. If they get involved in CPU design/fabrication, the buck will stop with them.
 
Exactly. It's not as if Apple owns a fab. They would be beholden to TSMC and Samsung to make their ARM chips, and TSMC is not going to get as good at fabrication as Intel, certainly not without x86-like volume.
It's funny how some posters here believed TSMC 'leaks' claiming they will have 10nm SoCs ready for the fall A10 release... If Intel can't mass-produce CPUs at 10nm yet, somehow TSMC is going to beat them? Fuhggeddaboutit!
The A10 will be built on existing 14/16nm silicon, just like the A9/A9X. It will be interesting to see how much Apple can squeeze out of AX chip performance without the luxury of a new process node...

Also, ARM is great at delivering high performance with little power consumption; that's why the A9X can trade blows with the pathetic/anemic Core M (Intel really gimped that chip so it can run without a fan). Scaling an ARM chip up to the 35-45 watt level to compete with Skylake at 14nm hasn't happened yet, and I'm not holding my breath...
 
Maybe Apple is secretly working on a plan to transition macOS to Apple ARM processors.

I would guess that they already have working copies of OS X (soon to be macOS?) running on ARM in Apple R&D. I would also guess that was one of the motivations for iOS hardware/software going 64-bit a couple of years back.
 
I wonder how Apple feels about the change from a two-step to a three-step process. Didn't they switch away from PowerPC chips because the update cycles were slow? Would they switch over to A9X-type chips or develop their own at some point?

As others have mentioned, the lack of competition has made Intel boring.
 
There are over 3 billion people with computers (internet-accessing devices) in this world. Source

There are less than 20 million people that would be editing code. Source

I think that 99% of the 3 billion don't need to be coding on iOS. Most of them just need email, a word processor, a spreadsheet, a browser, movies, pictures, music, and apps. And a large portion of those people only need basic capabilities in each of the categories I just mentioned. We need to stop looking at the world as if everyone is a techno-geek like most on MR. The truth is, we are in the significant minority.

Now where is your paycheck :D

This is pretty spot on. If you know how to code well, you will always find work. Of the 20 million that are "editing code," the numbers get quite thin when you start looking at specialties. At any one moment, there are usually only a few hundred to a few thousand people actively working in a specific field. This is why freelancing in software development is so lucrative.

One of your best negotiating points is your ability to find another job or partnership, since you are in demand. One old salt of the valley put it to me best: "Software is, was, and always will be a maverick's game. The cost of writing code is so low, and with the First Amendment and at-will employment in the USA, it takes a lot to keep you from working."

While there are diploma mills and quick-degree universities pumping out engineering degrees to foreign workers on H-1Bs, they lack two essential skills of a good software developer: freedom and imagination.
 
FINALLY a grammatically correct, fits-in-context example of the phrase "for the longest time"
 