Why are so many of you hellbent that Intel folds? Is your fanboyism so great that you actually want a monopoly? Do you not remember what lack of competition looked like for Intel? Consumers lost for years.

You really want that to happen again?
There are three competitors in this future - Nvidia with ARM, AMD with x86 and ARM, and Apple with M/ARM. How is it fanboyism to want to see Intel go away? Companies like that one are the absolute worst thing that ever happened to this industry and you're actually proving it in this example...

So when you ask why so many are hellbent, you answered your own question.
 
There is no assumption here. Apple literally has a diversity score that it uses to rate its hiring process. This is public information. They openly have offensive hiring policies that they deliberately engage in. They are doing everything they can to hire people based on race, gender, and sexual identity. The more marginalized the better, as it ticks more notches on their diversity score. Don't believe me? Google it. This disgusting crap is actually real.

And no one can hide the fact that this effort at Apple has paralleled their decline in quality.
This happens across the entire industry, not just at Apple. And it's been happening at Apple since the 90's...so how exactly has it paralleled a decline in quality NOW?
 
There is no assumption here. Apple literally has a diversity score that it uses to rate its hiring process. This is public information. They openly have offensive hiring policies that they deliberately engage in. They are doing everything they can to hire people based on race, gender, and sexual identity. The more marginalized the better, as it ticks more notches on their diversity score. Don't believe me? Google it. This disgusting crap is actually real.

And no one can hide the fact that this effort at Apple has paralleled their decline in quality.

Apple stock is worth more than it has ever been in history and the company has around 200 billion dollars cash on hand, but go off. I'm sure the video you saw on TikTok about white genocide wouldn't lie to you.

Literally none of the hiring process has anything to do with hiring unqualified people. Right-wing internet has rotted your brain mate. Better take some Hydroxychloroquine!
 
I feel sorry for those who purchased the maxed-out Mac Pros (2019). This same thing happened around the Power Mac G5 days. Steve Jobs sold us on how the G5 was the future and would be massively supported... barely a few years in, Apple switched to Intel, and that machine had the shortest usable lifespan of all my Macs. These Apple Silicon beasts will DESTROY the Mac Pro 2019. I can't wait for one of those in my studio!
Why? It was a machine for pros, making money with their computer. If they were really pros, then their machines will have given them 3 good years, and when the time comes for an upgrade, it'll presumably be a nice bump. If their work shows a benefit in moving to one of the M1 Max or Pro chips now, then they'll still have gotten 2 good years of revenue from their machines. Now if you're talking about a hobbyist who really wasn't the market for the Mac Pro but plunked down the money anyway... once again, it's not like the ARM Mac Pros came out right after they made their purchase. In fact, they're still a year away. So your comparison is silly. Future chips rule. But current chips are how a pro actually earns his or her living. It's not about fanboyism. It's about doing your job right now with the tools available.
 
It's not Macs or M1 that will lead to the end of Intel's marketshare - but M1 is what is driving AMD to an M1 competitor...and behind the scenes, Nvidia is brewing an entire line of ARM competitors to Intel on every front.
Imagine Chromebooks with MacBook Air level performance. ARM-based servers that can do more than low-end virtual web hosting. Integrated graphics that nearly always outperform dedicated video cards. Intel’s problems won’t be losing some specific fringe markets. It will be competitors burning holes through it on many fronts like Swiss cheese. Apple put a “phone chip” in computers that previously had top-of-the-line Intel processors and they got better in every measurable way. That did not go unnoticed in any corner of the industry.
 
Why are so many of you hellbent that Intel folds? Is your fanboyism so great that you actually want a monopoly? Do you not remember what lack of competition looked like for Intel? Consumers lost for years.

You really want that to happen again?
That's the first time I've been called an Apple fanboy. If you check my post history you'll see I'm often very critical of the company when the situation warrants, which recently has been quite often.

And there is no shortage of competition. AMD is a very strong competitor.
 
Apple stock is worth more than it has ever been in history and the company has around 200 billion dollars cash on hand, but go off. I'm sure the video you saw on TikTok about white genocide wouldn't lie to you.

Literally none of the hiring process has anything to do with hiring unqualified people. Right-wing internet has rotted your brain mate. Better take some Hydroxychloroquine!
Ok. I didn't mention any of those things, you did. I'm citing Apple's published information.
 
This happens across the entire industry, not just at Apple. And it's been happening at Apple since the 90's...so how exactly has it paralleled a decline in quality NOW?
Well this is untrue. Apple's current diversity program only started a few years ago. This is what I'm referring to.
 
It was ages ago and frankly no one really cares. Workstation users buy the 1TB or so with the machine and warranty, customers buy what they need. By the time you need more RAM there is a new CPU and other things you want/need.
That's not always the case. I can certainly imagine a research group buying a computer with a certain amount of RAM, where the next year their research takes them a new direction where more RAM is needed. That happened to me when I was a grad student. Fortunately, at the time, the RAM in my PowerMac G5 was upgradeable.
 
We've now entered a news cycle where awesome Mac rumors are once again the norm.
Honestly I don't think it's been this good since the early G4 days, and even then Apple didn't have their own chip architecture. Good times to be a Mac and Apple user!
 
M1, M1 Pro, M1 Max, M1 Max Plus, M1 Extreme, M1 Extreme Plus, M1 Ultra, M1 Ultra Plus, M1 Mega, M1 Mega Plus, M1Hyper, M1 Hyper Plus...
 
There are three competitors in this future - Nvidia with ARM, AMD with x86 and ARM, and Apple with M/ARM. How is it fanboyism to want to see Intel go away? Companies like that one are the absolute worst thing that ever happened to this industry and you're actually proving it in this example...

So when you ask why so many are hellbent, you answered your own question.
Since when does Nvidia make CPUs? If you got rid of Intel you'd only have ONE company making x86 processors for Windows, Linux etc.

Have you really thought this through?
 
40 compute cores in a laptop...holy fertilizer batman. So much for my 2021 being a 5 year machine.
 
Well this is untrue. Apple's current diversity program only started a few years ago. This is what I'm referring to.
They were doing it anyway. As were others. It merely got this industry-wide label because the practice had become topical a few years ago.

Proclaiming ‘diversity’ is no risk to a company when there’s an abundant pool of well-qualified people from which to draw. One could call it ’safe’ to do so at that point…and even a little cynical.
 
I’m waiting for the second or third generation of these chips before buying. If the third gen is going to be this good then that decides it for me. I can wait until autumn 2023. I will get the M3 Max 14” MBP with 128GB RAM and 8TB SSD (which will probably sit at a lower price just below the 16TB model by then) and just use that as my desktop docked and laptop computer for 8 years or until I break it.

I’m hoping by then it has a micro LED display as well. I feel like OLED won’t last as long as I want it to, given image retention issues that are more inherent to a desktop OS. Linus of Linus Tech Tips on YouTube has been using a smaller LG OLED TV as his primary display for the last year and it already has some burn-in issues that he outlined in a video. But I’ve been using the larger size as my TV for the past year and a half and it doesn’t have the same issues, nor does my iPhone in the year before I upgrade. I think that’s because those devices don’t have consistent UI elements that are always in the same place. The iPhone can also be held in multiple orientations, is used for video a lot, and the status bar at the top changes colors, so it’s not going to burn in.
 
That's not always the case. I can certainly imagine a research group buying a computer with a certain amount of RAM, where the next year their research takes them a new direction where more RAM is needed. That happened to me when I was a grad student. Fortunately, at the time, the RAM in my PowerMac G5 was upgradeable.

The days of users putting off-the-shelf chips in boxes are coming to a close. These are legacy paradigms from the early days of computing that probably should have died a long time ago, anyway. I believe in this decade we will no longer be able to increase the performance of a computer with anything put in an expansion card slot. We may be able to add functionality to a computer, but never match or exceed the performance of integrated components. We may be on the verge of RAM and storage not being fundamentally different enough in performance or function to justify making a distinction in specs. If you have more stuff to do, you just buy a chip with more storage. Right now we have swap storage that works as fast as RAM did just a couple of generations ago, and RAM faster than Level 2 cache was a generation ago. It’s only going to get better.
 
The installed base of Intel chips is too massive to declare game over.
Yesterday’s chip sales don’t pay tomorrow’s bills. To see which way the winds blow we will need to watch Microsoft. If MS still values the WIntel market alliance and is OK with being a relative performance laggard for a few years, then Intel will have breathing room to retool and get back in the game. But if Microsoft starts embracing ARM, even casually at first, it will be catastrophic for Intel. When Dell ships their first ARM-based laptop with an Nvidia CPU and Nvidia GPU cores in an SoC running Windows, I wouldn’t want to be holding any Intel stock in my portfolio.
 
That doesn't make ANY sense !

They've already overshot their existing market by a wide margin !

The M1 Pro & M1 Max are overkill for 99.9% of Apple's existing customer base !

I suspect what we're hearing about now is the 911 GT3 version, good for press coverage, but NOT a volume driver !
It will be in the future $50K Mac Pro
 
In my experience, there is no amount of power that is ever too much for software developers.
It's so true. Just when we have enough power for 4K workflows, bam, 8K enters the scene (4x the data as 4K, 16x HD). And on and on it goes...
 
It's so true. Just when we have enough power for 4K workflows, bam, 8K enters the scene (4x the data as 4K, 16x HD). And on and on it goes...

8 and 16K workflows even serve to make better 4K outputs. So even if your deliverable specs haven’t changed, you can now deliver better quality.
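The multipliers in that exchange are easy to sanity-check. Here's a quick sketch of the pixel-count arithmetic, assuming the common 16:9 delivery sizes (1080p HD, 3840×2160 UHD "4K", and 7680×4320 UHD "8K"):

```python
# Per-frame pixel counts for common 16:9 delivery resolutions.
resolutions = {
    "HD": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Each step doubles both dimensions, so pixel count quadruples:
# 8K carries 4x the pixels of 4K and 16x the pixels of HD per frame.
print(pixels["8K"] // pixels["4K"])  # 4
print(pixels["8K"] // pixels["HD"])  # 16
```

Uncompressed data rates scale the same way for a given bit depth and frame rate, which is why each resolution jump feels like it resets the hardware requirements.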
 