LOL, you don't know anything about the semiconductor industry, do you?

Intel is the #2 semiconductor company in its own city of Santa Clara, California. Nvidia DWARFS Intel in market capitalization by something like $100 billion. Intel was founded in 1968. Nvidia was founded in 1993.

Intel's "human capital management issue" includes top-tier chip design engineers going to the competition on the other side of the street. While Intel is a moribund, listless shell living in the shadow of its former greatness, Nvidia is the rockstar, growing enormously with plenty of future opportunities.

Remember that AMD and Apple are in the bordering cities of Sunnyvale and Cupertino respectively. More reasons for Intel's brain drain.

Intel is under fire not just for their eroding marketshare of desktop computing but also for their increasing irrelevance elsewhere (mobile, handheld, cloud, HPC). What's the chance of a smartwatch having "Intel Inside"? Zero percent.
This is a really good point. It’s hard to say no to a job offer from your competition’s headhunters when they offer you a significant increase in pay and more of a challenge, and you don’t have to move, or at least not far.
 
The 7 nm screwup is too big to hide. I think it's game over at this point. It's just about how quickly they disintegrate. I held INTC for a long time but sold in the middle of this year when the CEO said they were thinking of using TSMC. That basically sent the message of "we've screwed up so bad we have no idea how to fix it" or, in other words, the people that knew how to make it work are no longer at Intel.
 
And why some of the newer Ryzen designs are like hockey pucks. I think the issue is backward compatibility: Intel can't start over like Apple did. Intel can't say "no more 32/16-bit apps." (I still decided a 10900K won out against a Ryzen in my own new build.)
Why can’t they draw a line under 32/16-bit?
 
How much 16-bit code are you still running on your machine? How many memory models are you utilizing (because you need them, not just because Intel forces a boot into legacy modes before you can switch to a modern one)? Certainly the Mac didn’t need any of that, but the Mac was burdened with it. Are we all running slower and burning more power because a couple of banks and government agencies are trying to keep their 16-bit COBOL apps alive?
Agreed with the rest of your post; just wanted to raise some questions on this out of curiosity. Why do these banks and government agencies (and probably many other legacy users: military modules, radars, car-manufacturing machines, etc.) still relying on 16-bit COBOL apps NEED 2020s CPUs to run their 1980s apps? Why don’t they just use the last generation that suited them, with Intel/AMD continuing to produce those older chips in limited runs to replace the ones that fail? I don’t know much about this, but with all the M1 news and info out there, it feels like everyday users are dragged down by legacy transistors that are useless to them, while legacy users pay for an expensive modern transistor count that’s equally useless for what they need: a lose-lose situation.
 
So many Intel haters.
Just wait until Apple's ARM chips don't satisfy digital content creators.
You guys have a lot of hope that these M1 processors, with all their proprietary Apple GPU and other BS, will meet your needs.

I personally LOVE Intel and building my own PC.
Every iMac I have ever owned has had a bad video card, making it useless for graphic design or video editing.
Not quite Intel “haters”; we can all agree that Intel being phased out is bad for consumers, competition, jobs, and the industry as a whole. And the hate isn’t unwarranted... they just pump out high-TDP, heat-producing chips and leave the OEMs to fend for themselves. Apple isn’t the only one suffering from it; Razer, Dell, Lenovo, and all the others have to deal with the same issues... and actually I think Apple did an amazing job of keeping at it (minus the latest Intel MacBook Air), so much so that even unplugged on battery power the benchmarks are exactly the same (and sometimes faster, as the chip heats up a bit less without power-delivery charging), while the other OEMs would sometimes see half the scores.

From personal experience, I’ve got nothing against Intel, iMacs, or their cards. Heck, the iMac has 16GB of VRAM and plenty of power for such a compact all-in-one. It chews through Blender, Unity, and 3ds Max (even via Parallels); FCP, Motion, After Effects, and Photoshop even more so... but there’s one thing the M1 and its successors bring to the table. Much like when GPUs first came out and offloaded graphics, vertex transforms, and pixel rendering from the CPU, Apple’s M1 has TONS of GPU-like CPU offloading: ML cores, dedicated input/output silicon, dedicated encryption/decryption and file-system hardware, dedicated encode/decode blocks (the main reason it can handle insane codecs that even a Mac Pro can’t), and so on and so forth, all tight and close together...

I’m no expert and I might be wrong, but I think there’s something quite powerful going on here and I would wait and hesitate to call it ”BS” so early.
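
To make that "dedicated blocks" point a little more concrete, here's a minimal Swift sketch (not what FCP or Motion actually do under the hood, just the public VideoToolbox query): you can ask whether the machine has a hardware decoder for a given codec, which on Apple Silicon is answered by the built-in media engine rather than the CPU cores.

```swift
import CoreMedia
import VideoToolbox

// Minimal sketch: query whether a hardware decoder exists for a codec.
// On Apple Silicon this reflects the dedicated media engine; on Intel Macs
// it depends on the GPU / Quick Sync support instead.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]

for (name, codec) in codecs {
    let supported = VTIsHardwareDecodeSupported(codec)
    print("\(name): hardware decode \(supported ? "available" : "not available")")
}
```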
 
I don't give a rat's if Intel ceases to exist. Intel, together with Microsoft, stifled advancement, destroyed competition, and kept prices up for well over a decade. I hope greed will kill them off. I'm glad Apple will ditch Intel and I hope others will follow.
 
It's my understanding that:

(a) Xeon still very much dominates in the server market over Epyc/Opteron and that there's no imminent danger of that changing drastically anytime soon

(b) Non-Xeon consumer/prosumer x86 CPUs are only one of MANY different businesses and business units that Intel has (for instance, it has a fairly decent flash storage business last I checked; it also owns McAfee which isn't a small fish by any means)

(c) Given (b), the fact that people are freaking out about Intel going downhill based on only one of its many business units (albeit, its most popular one) seems a bit rash.
Re (b): Intel's Oct '20 earnings statement reads: "Announced agreement to sell Intel NAND memory and storage business to SK hynix for $9.0 billion." And McAfee was sold off back in September 2016.
Re (a): Yes, but more and more Arm is showing up in the server space. Microsoft most recently, but Google, Amazon, etc. are also working on their own chips, and the likes of Ampere are starting to dig into this space.
Re (c): Please read Intel's earnings announcements; there's not a whole lot of revenue outside of x86.

Intel will continue to dominate x86, which is not going to go away any time soon, but their revenue has been mostly flat and will decline further going forward...
 
AMD has shown themselves to be much more agile. I would not be surprised if they led the PC market into a new age on a new architecture. I’m not sure Intel can be that company. The Itanium experience doesn’t bode well.

Unless x86 has a bunch of hidden potential we haven’t seen, I can’t imagine Intel will be the company to lead the next wave unless they buy a company who can and essentially transfer the name to a new company...
AMD has definitely benefited from going fabless. I do not see them "leading the pack" to a new architecture; there's a lot of baggage, and the OEM industry (HP, Dell, etc.) will not follow AMD. By the same token, Intel is not the one either, and neither is Microsoft; there is a lot of dirt, and the industry is not going to follow the "Wintel" model moving forward.
Other than Apple, I don't see anyone shaking all of this loose, hence I believe x86 will continue to live longer than it should...
 
Not quite Intel “haters”; we can all agree that Intel being phased out is bad for consumers, competition, jobs, and the industry as a whole. And the hate isn’t unwarranted...

I don’t agree at all that Intel going away would be bad for consumers, competition, jobs, or industry. Intel has had a tremendously anti-competitive effect on the industry for years, and has held the industry back for years. Since the early 1990’s, at least.
 
Except Intel doesn’t compete.

Well, they don't compete in x86, but for all the other products they make, where they do have to compete, they are not leaders...
 
Intel:
take action on big improvements to your graphics GPUs
lower the price of overclock-capable CPUs
get cracking on your wafer yields
I miss Gordon Moore at the helm
 
AMD has definitely benefited from going fabless. I do not see them "leading the pack" to a new architecture; there's a lot of baggage, and the OEM industry (HP, Dell, etc.) will not follow AMD. By the same token, Intel is not the one either, and neither is Microsoft; there is a lot of dirt, and the industry is not going to follow the "Wintel" model moving forward.
Other than Apple, I don't see anyone shaking all of this loose, hence I believe x86 will continue to live longer than it should...

None of the companies you mention will hitch their wagon to an upstart. If Intel had what it took to break with tradition and launch something new, those companies would love to stay with Intel. But Intel doesn’t have what it takes. Itanium didn’t fail because customers resisted change; HP codeveloped it, and big-iron makers were eager to use it. It failed because Intel flat-out failed to deliver, and I believe they failed to deliver because they’re culturally incapable of leaving x86 behind and introducing a new architecture.

AMD has the potential to be seen as a reliable vendor, who understands the x86 landscape, but can also be nimble without being anchored to the past. Remember, while Intel was sinking the Itanic, AMD was finding a much more pragmatic path to 64 bit and the industry embraced the AMD solution.
 
I think it has been their biggest asset in the past; it’s why that hot mess of an architecture is still the top-selling PC processor.

I also think Intel believes it’s their biggest asset now. I’m less sure if that’s actually true, or if Intel just fell victim to their own marketing.

The big question is whether that asset has suddenly become a massive liability. It may be too early to tell, but the performance of AS hints that it has. The fact that AS can run translated code nearly as fast as, or faster than, the original x86 on a native processor points the same way.

How much 16-bit code are you still running on your machine? How many memory models are you utilizing (because you need them, not just because Intel forces a boot into legacy modes before you can switch to a modern one)? Certainly the Mac didn’t need any of that, but the Mac was burdened with it. Are we all running slower and burning more power because a couple of banks and government agencies are trying to keep their 16-bit COBOL apps alive?

Ordinarily I’d say Intel knows their business better than I can guess it, but I’m starting to wonder if backwards compatibility went from customer need to religion at some point and stopped being questioned.
I personally have a few DOS games I still play that I consider classics. But as noted above, emulation will not solve all the issues with Intel x86 architecture code. Deep and wide is the big issue, and it's what made Intel the powerhouse, which it still is.
 
I don’t agree at all that Intel going away would be bad for consumers, competition, jobs, or industry. Intel has had a tremendously anti-competitive effect on the industry for years, and has held the industry back for years. Since the early 1990’s, at least.
Can definitely agree on that; the bad practices and anticompetitive behavior could definitely be done away with.
 
I've been testing various x86 apps under the insider previews of both Parallels and Windows 10 on ARM and have been pleasantly surprised at how snappy everything is, and how relatively reliable. The emulation actually works pretty damn well, albeit not as well as Rosetta. I think Microsoft could easily work out those kinks, and that, along with exponentially better performance from a new generation of ARM PC chips that can easily overcome the emulation overhead (as the M1 does), will move this whole thing along super fast. Do that, and demand for ARM-native ports of, say, games goes up, and Intel is done. AMD will also likely move to ARM, and is probably in a better position to do so than Intel.
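
Semi-related Mac-side aside: if you ever want to check whether a given process is actually going through Rosetta 2 rather than running natively, Apple documents a sysctl flag for exactly that. A minimal Swift sketch (just the documented sysctl.proc_translated check, nothing more):

```swift
import Foundation

// Minimal sketch: Apple's documented "sysctl.proc_translated" flag is 1 when
// the current process is being translated by Rosetta 2, 0 when it runs
// natively; the call fails on systems that don't know the flag.
func processIsTranslated() -> Bool? {
    var flag: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &flag, &size, nil, 0) == 0 else {
        return nil  // flag unavailable (e.g. older macOS); treat as unknown
    }
    return flag == 1
}

switch processIsTranslated() {
case true?:  print("Running under Rosetta 2 translation")
case false?: print("Running natively")
default:     print("Translation status unknown")
}
```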
They don’t need to. They can give developers the ARM tools and universal binary packaging, like Apple does. This is just a stopgap proof of concept, mostly for older apps that won’t be updated.
 