The supposed demise of Intel is bad news for everyone. Competition leads to better products.
> The supposed demise of Intel is bad news for everyone. Competition leads to better products.
LOL, you don't know anything about the semiconductor industry, do you?

Intel is the #2 semiconductor company in its own city of Santa Clara, California. Nvidia DWARFS Intel in market capitalization, by something like $100 billion. Intel was founded in 1968; Nvidia in 1993.

Intel's "human capital management issue" includes top-tier chip design engineers going to the competition on the other side of the street. While Intel is a moribund, listless shell living in the shadow of its former greatness, Nvidia is the rock star, growing enormously with plenty of future opportunities.

Remember that AMD and Apple are in the bordering cities of Sunnyvale and Cupertino, respectively. More reasons for Intel's brain drain.

Intel is under fire not just for its eroding market share in desktop computing but also for its increasing irrelevance elsewhere (mobile, handheld, cloud, HPC). What's the chance of a smartwatch having "Intel Inside"? Zero percent.

> Intel's "human capital management issue" includes top-tier chip design engineers going to the competition on the other side of the street.

This is a really good point. It's hard to say no to a job offer from your competition's headhunters when they offer you a significant increase in pay and more of a challenge, and you don't have to move, or at least not far.
> And why are some of the newer Ryzen designs like hockey pucks? I think the issue is backward compatibility; Intel can't start over like Apple did. Intel can't say "no more 32/16-bit apps." (I still decided a 10900K won out against a Ryzen in my own new build.)

Why can't they draw a line under 32/16-bit?
> How much 16-bit code are you still running on your machine? How many memory models are you utilizing (because you need them, not just because Intel forces a boot through legacy modes before you can switch to a modern one)? Are we all running slower and burning more power because a couple of banks and government agencies are trying to keep their 16-bit COBOL apps alive?

Agreed with the rest of your post; just wanted to bring up questions on this out of curiosity. Why is it that these banks and government agencies (and probably many legacy things like military modules, radars, car-manufacturing machines, etc.) still relying on 16-bit COBOL apps NEED 2020s CPUs to run their 1980s apps? Why don't they just use the latest CPUs available for that purpose? Intel/AMD could keep providing those CPUs to replace the ones that fall apart, in a reduced, replacement-only production style. I don't know much about this, but with all the M1 news and info out there, it feels like everyday users are dragged down by legacy transistors that are useless to them, and legacy users are dragged along paying for an expensive modern transistor count that's also useless for what they need: a waste-waste situation.
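Tangent for anyone curious how software even asks the CPU about legacy vs. modern modes: the sketch below is my own illustration, not anything Intel ships. It assumes GCC or Clang's <cpuid.h> on an x86 box, and queries CPUID for the AMD64 long-mode flag, the "modern" mode that still sits on top of the real/protected modes every boot walks through.

```c
/* lm_probe.c: ask an x86 CPU, via CPUID, whether it supports 64-bit
 * long mode. The CPU still powers up in 16-bit real mode; long mode
 * is something the OS has to switch into after boot.
 * Build: cc lm_probe.c -o lm_probe
 */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    /* Extended leaf 0x80000001: EDX bit 29 is the LM (long mode) flag. */
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID extended leaf not available");
        return 1;
    }
    printf("64-bit long mode supported: %s\n",
           (edx & (1u << 29)) ? "yes" : "no");
    return 0;
}
```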
> so many intel haters

Not quite Intel "haters"; we can all agree that Intel being phased out is bad for consumers, competition, jobs, and the industry as a whole. And the hate isn't unwarranted... they just pump out high-TDP, heat-producing chips and leave the OEMs to fend for themselves. Apple isn't the only one suffering it; Razer, Dell, Lenovo, and all the others have to deal with the same issues. Actually, I think Apple did an amazing job of keeping at it (minus the latest Intel MacBook Air), so much so that even unplugged, on battery power, the benchmarks are exactly the same (and sometimes faster, as the chip heats a bit less without power-delivery charging), while the other OEMs would sometimes have half the scores.
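If you want to see that throttling for yourself instead of trusting benchmark charts, a crude sustained-load probe is enough. A minimal sketch in plain C (POSIX timing; not a real benchmark, just enough to watch round times climb once the chassis heats up):

```c
/* throttle_probe.c: run the same fixed chunk of work repeatedly and
 * print each round's wall time; round times that climb noticeably
 * between early and late rounds hint at thermal throttling.
 * Build: cc -O2 throttle_probe.c -o throttle_probe
 */
#include <stdio.h>
#include <time.h>

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    volatile double x = 1.0;  /* volatile keeps the loop from being optimized away */
    for (int round = 0; round < 15; round++) {
        double t0 = now_sec();
        for (long i = 0; i < 200000000L; i++)
            x *= 1.000000001;
        printf("round %2d: %.2f s\n", round, now_sec() - t0);
    }
    return 0;
}
```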
Just wait for when the Apple ARM chips don't satisfy digital content creators.

You guys have a lot of hope in Apple, that these M1 processors and all their proprietary GPU and other BS will meet your needs.

I personally LOVE Intel and building my own PC.
Every iMac I have ever owned has had a bad video card, making it useless for graphic design or video editing.
> It's my understanding that:
> (a) Xeon still very much dominates the server market over Epyc/Opteron, and there's no imminent danger of that changing drastically anytime soon;
> (b) non-Xeon consumer/prosumer x86 CPUs are only one of MANY different businesses and business units that Intel has (for instance, it has a fairly decent flash storage business, last I checked; it also owns McAfee, which isn't a small fish by any means);
> (c) given (b), the fact that people are freaking out about Intel going downhill based on only one of its many business units (albeit its most popular one) seems a bit rash.

Re (b): the Oct '20 earnings statement reads, "Announced agreement to sell Intel NAND memory and storage business to SK hynix for $9.0 billion."
> AMD has shown themselves to be much more agile. I would not be surprised if they led the PC market into a new age on a new architecture. I'm not sure Intel can be that company; the Itanium experience doesn't bode well. Unless x86 has a bunch of hidden potential we haven't seen, I can't imagine Intel will be the company to lead the next wave, unless they buy a company who can and essentially transfer the name to a new company...

AMD has definitely benefited from going fabless. I do not see them "leading the pack" to a new architecture; there's a lot of baggage, and the OEM industry (HP, Dell, etc.) will not follow AMD. By the same token, Intel is not the one either, and neither is Microsoft; there is a lot of dirt, and the industry is not going to follow the "Wintel" model moving forward.
> Not quite Intel "haters"; we can all agree that Intel being phased out is bad for consumers, competition, jobs, and the industry as a whole. And the hate isn't unwarranted...

> AMD has definitely benefited from going fabless. I do not see them "leading the pack" to a new architecture; there's a lot of baggage, and the OEM industry (HP, Dell, etc.) will not follow AMD.

Except Intel doesn't compete:

Advanced Micro Devices, Inc. v. Intel Corp. (en.wikipedia.org)

Intel Is Still Fighting the EU Over Its Anti-Competitive Actions Against AMD (www.extremetech.com): "Intel was hit with a $1.49B fine back in 2009 for anti-competitive practices. Eleven years later, the company is still fighting the verdict."

> Except Intel doesn't compete.

Well, they don't compete in x86, but they do in all the other products they make, and where they do have to compete, they are not the leaders...
> That's what everyone said when we invented AMD64, too. How'd that turn out?

Point taken. It still does not change my mind, though; I do not see them leading the way to a new architecture, away from x86.
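For the programmer-facing side of what AMD64 actually changed, the data model is the quickest illustration. A tiny C probe (assuming a GCC or Clang x86 toolchain; the -m32 build also needs 32-bit multilib support installed) shows the ILP32-to-LP64 jump:

```c
/* width_probe.c: print the type widths that separate classic 32-bit
 * x86 (ILP32) from AMD64's LP64 data model.
 * Build and compare:
 *   cc -m32 width_probe.c -o probe32 && ./probe32
 *   cc -m64 width_probe.c -o probe64 && ./probe64
 */
#include <stdio.h>

int main(void) {
    printf("pointer: %zu bytes, long: %zu bytes, int: %zu bytes\n",
           sizeof(void *), sizeof(long), sizeof(int));
    return 0;
}
```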
> They can buy AMD to begin with.

I highly doubt that regulators would allow that.
> AMD has definitely benefited from going fabless. I do not see them "leading the pack" to a new architecture; there's a lot of baggage, and the OEM industry (HP, Dell, etc.) will not follow AMD. By the same token, Intel is not the one either, and neither is Microsoft; there is a lot of dirt, and the industry is not going to follow the "Wintel" model moving forward. Other than Apple, I don't see anyone shaking all of this loose, hence I believe x86 will continue to live longer than it should...
> I think it has been their biggest asset in the past; it's why that hot mess of an architecture is still the top-selling PC processor. I also think Intel believes it's their biggest asset now. I'm less sure whether that's actually true, or whether Intel just fell victim to their own marketing.
>
> The big question is whether that asset has suddenly become a massive liability. It may be too early to tell, but the performance of AS hints at it: the fact that AS can run translated code nearly as fast as, or faster than, the original x86 code runs on a native processor suggests the answer is yes.
>
> How much 16-bit code are you still running on your machine? How many memory models are you utilizing (because you need them, not just because Intel forces a boot through legacy modes before you can switch to a modern one)? Certainly the Mac didn't need any of that, but the Mac was burdened with it. Are we all running slower and burning more power because a couple of banks and government agencies are trying to keep their 16-bit COBOL apps alive?
>
> Ordinarily I'd say Intel knows their business better than I can guess it, but I'm starting to wonder if backwards compatibility went from customer need to religion at some point and stopped being questioned.

Me personally, I have a few DOS games I still play that I consider classics. But as noted above, emulation will not solve all issues with Intel x86 architecture code. Deep and wide is the big issue, and it's what made Intel the powerhouse it still is.
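On the "translated code nearly as fast" point: macOS will actually tell a process whether it's running under Rosetta 2, via the documented sysctl.proc_translated sysctl. A minimal C sketch:

```c
/* rosetta_probe.c: report whether this process is running natively
 * or translated under Rosetta 2 on macOS.
 * Build: clang rosetta_probe.c -o rosetta_probe
 */
#include <stdio.h>
#include <errno.h>
#include <sys/types.h>
#include <sys/sysctl.h>

int main(void) {
    int translated = 0;
    size_t size = sizeof(translated);

    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1) {
        /* The sysctl doesn't exist on systems without Rosetta 2. */
        if (errno == ENOENT)
            puts("native (no translation layer on this system)");
        else
            perror("sysctlbyname");
        return 0;
    }
    puts(translated ? "running under Rosetta 2 translation" : "running natively");
    return 0;
}
```

Handy when you're benchmarking and want to be sure which version of a binary you're actually timing.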
> I don't agree at all that Intel going away would be bad for consumers, competition, jobs, or industry. Intel has had a tremendously anti-competitive effect on the industry, and has held the industry back, for years. Since the early 1990s, at least.

Can definitely agree on that; the bad practices and anticompetitive behavior could definitely be done away with.
> I've been testing various x86 apps under the Insider previews of both Parallels and Windows 10 on ARM and have been pleasantly surprised at how snappy everything is, and relatively reliable. The emulation actually works pretty damn well, albeit not as well as Rosetta. I think Microsoft could easily work out those kinks, and that, along with exponentially better performance from a new generation of ARM PC chips that can easily overcome the emulation overhead (as the M1 does), will move this whole thing along super fast. You do that, then you get demand for ARM-native ports of, say, games, and Intel is done. AMD will also likely move to ARM, and is probably in a better position to do so than Intel.

They don't need to. They can give developers the ARM tools and universal binary packaging like Apple does. This is just a stopgap proof of concept, mostly for older apps that won't be updated.
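On the universal-binary packaging point, Apple's side really is that simple for developers. A sketch assuming Apple's clang and lipo tools (one source file, one build command, two architecture slices):

```c
/* arch_probe.c: print which slice of a universal binary is executing.
 * Build a two-slice "fat" binary with Apple clang:
 *   clang -arch x86_64 -arch arm64 arch_probe.c -o arch_probe
 * Inspect the slices:
 *   lipo -info arch_probe
 */
#include <stdio.h>

int main(void) {
#if defined(__arm64__) || defined(__aarch64__)
    puts("arm64 slice (Apple Silicon native)");
#elif defined(__x86_64__)
    puts("x86_64 slice (Intel native, or Rosetta 2 on Apple Silicon)");
#else
    puts("some other architecture");
#endif
    return 0;
}
```

macOS picks the matching slice at launch, and Rosetta 2 covers anything that only ships x86_64.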