Intel Urged to Take 'Immediate Action' Amid Threats From Apple Silicon and AMD

Well, some organisation(s) must believe there is talent at Intel, as their people are being headhunted all over the place. Their proprietary Intel-speak does not mean there aren't talented designers underneath from a conceptual standpoint. It could be a case of their creativity being absolutely smothered by middle management who are resistant to change because "that's the way we have always done things".
You saying that “their people are being head hunted all over the place” does not make it true. We certainly never hired anyone from there, and we were their biggest (and arguably only) competitor. Nor did we hire folks from there at the other CPU companies I worked at.
 
I think this shows a lack of foresight. Although no one yet quite knows how to best exploit FPGA technology in mainstream computing, they really do offer an awful lot of potential.
Afterburner?

I think the failed assumption is “mainstream”. If it were mainstream, you’d use dedicated logic because it’s faster, denser and more power efficient. FPGAs by nature are best for niche applications.
 
Agreed. FPGA seldom makes much sense in production products. It’s most useful where volume is very low, or where you are prototyping or performing logic verification. Most scenarios where reprogramming the logic blocks in the field is useful would be equally easy to do with custom ASICs and firmware updates.

There is a class of algorithms that benefit from dynamic FPGA reconfigurations (during calculations), but it’s a pretty small class.
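As a rough software analogy of what dynamic reconfiguration buys you (Python, with made-up names; real designs go through vendor partial-reconfiguration flows and bitstreams), the same physical region gets reloaded with different logic between phases of a calculation, trading reconfiguration time for silicon area:

```python
# Software analogy only: one physical fabric "region" is reloaded with
# different logic between phases of a computation. All names here are
# invented for illustration; this is not a real vendor API.
import time

class FabricRegion:
    def __init__(self):
        self.logic = None              # whatever the region currently implements

    def reconfigure(self, logic, load_time_s=0.001):
        time.sleep(load_time_s)        # reconfiguration is not free
        self.logic = logic

    def run(self, data):
        return [self.logic(x) for x in data]

region = FabricRegion()
samples = list(range(8))

# Phase 1: the region implements a "scale and offset" stage.
region.reconfigure(lambda x: 3 * x + 1)
phase1 = region.run(samples)

# Phase 2: the *same* region is reloaded as a "modulo" stage,
# reusing the area instead of dedicating silicon to both stages.
region.reconfigure(lambda x: x % 5)
print(region.run(phase1))
```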
 
Yeah. There are applications where accelerating by FPGA is more efficient than by GPU, for example, but not so many that it makes sense to put an FPGA on the motherboard or embed it in the CPU package. You'd do it with an accelerator card, just like most do with the GPU, and the GPU has the advantage of being useful even if you're not doing general-purpose computing on it.

For most stuff, the Apple approach of dedicated silicon for high-value tasks makes much more sense. When GPUs weren't quite good enough for neural acceleration, for example, they implemented a dedicated neural processor.

I think Altera was another one of Intel's "we roolz the process" acquisitions. FPGAs rely on cutting edge process to stay competitive in power, and Intel probably still thought they had that. Meanwhile, Xilinx (now part of AMD) is fabbed by TSMC.

I'm noticing a pattern...

My guess is Intel intended to take the same approach they did with GPUs and wed them to their CPUs. AMD seems perfectly comfortable shipping GPUs as expansion cards, whether you're using an AMD processor or not, and is probably intending to follow the same approach with FPGAs to build on Xilinx's move into data centers.
 
While the TDP is indeed the same, the actual power draw variance of recent Intel 14nm generations has gone up significantly — they try to cool down more, but they also heat up more when performance demands it.

Apple’s case design isn’t ideal for that, but also, recent Intel generations are simply very power-inefficient.
Agreed. Both Intel and Apple are complicit.
 
I think the failed assumption is “mainstream”. If it were mainstream, you’d use dedicated logic because it’s faster, denser and more power efficient. FPGAs by nature are best for niche applications.

By mainstream I meant beyond their use in prototyping for things like ASICs etc. Perhaps a better choice of words would be “commercial computing”
Agreed. FPGA seldom makes much sense in production products.

You both can help me then, because I've been trying to work out why the biggest and second-biggest manufacturers of desktop/server CPUs have spent billions acquiring two of the biggest producers of FPGAs. Buying these companies just to sell FPGAs doesn't seem to make much sense. Do they intend to use or integrate this IP with their existing technologies to produce a commercial product?

Other than their obvious benefit of being infinitely reconfigurable, aren’t they also very capable with regards to massive parallelisation?

For Intel at least, it seems to be data center applications:
 
Presumably they bought them to diversify.

There’s nothing about FPGAs that makes them good for parallelization. FPGAs are just, at their heart, an array of combinatorial logic blocks (CLBs) arranged in a grid. Each block can be assigned a function, but the functions are pretty simple, like AND-OR or a 6-input MUX. In order to do this, by definition, once the FPGA is programmed there will be, within each CLB, transistors that are doing nothing to further the computation. They are just there to support the reconfigurability.

Anything that can be done with an FPGA can be done with an ASIC. The only advantage of an FPGA is that you can program one quickly (much more quickly than fabbing an ASIC). And for a few algorithms you might dynamically change the logic on the fly, but that’s very few algorithms.
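To make the CLB/LUT point concrete, here's a toy model of a single 4-input lookup table in Python (purely illustrative, not tied to any real device or toolchain): the 16 configuration bits are the "program", and the memory and multiplexing needed to hold and select them are exactly the overhead an ASIC wouldn't carry.

```python
# Toy model of one FPGA configurable logic block (CLB): a 4-input
# lookup table (LUT). The 16 configuration bits ARE the "program";
# the same hardware can evaluate any 4-input Boolean function.
class LUT4:
    def __init__(self, config_bits):
        # config_bits: 16 values, one output per input combination.
        assert len(config_bits) == 16
        self.config = list(config_bits)

    def evaluate(self, a, b, c, d):
        # The four inputs just index into the configuration memory.
        index = (a << 3) | (b << 2) | (c << 1) | d
        return self.config[index]

# Configure the LUT as a 4-input AND...
and4 = LUT4([1 if i == 0b1111 else 0 for i in range(16)])
print(and4.evaluate(1, 1, 1, 1))   # 1
print(and4.evaluate(1, 0, 1, 1))   # 0

# ...and the *same* structure as a 4-input XOR, just by changing the bits.
xor4 = LUT4([bin(i).count("1") % 2 for i in range(16)])
print(xor4.evaluate(1, 0, 1, 1))   # 1
```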
 
This article today suggests that Intel will reveal their long-term manufacturing plans at the earnings announcement on 1/21.


 
Hopefully this time it will be non-fiction? Naahhhh, likely more fiction :)
 
What does this failure say about the American union-controlled education system, especially the total lack of viewpoint diversity in universities?
I remember when Israeli engineers saved Intel from technological obsolescence in 2007. Intel was stuck in a mindset with no capacity for creative problem-solving. The Israelis, such as Ron Friedman, were placed in charge of worldwide design operations. https://www.seattletimes.com/business/how-israel-saved-intel/
 