AMD didn’t buy Nuvia. Qualcomm did.

And Nuvia doesn’t have “the” former Apple silicon designer. It has a couple of people from the team. One of whom used to work for me at AMD for a project or two, and with me on a project or two. Nuvia has no chips that actually exist, so we have no idea how fast their chip would have been.
Yes, Qualcomm. I got them mixed up. But it certainly is headed by a former Apple chip designer, and they have provided information detailing their CPU stats.
 
Yes, Qualcomm. I got them mixed up. But it certainly is headed by a former Apple chip designer, and they have provided information detailing their CPU stats.
Where is that information? I haven’t seen any numbers claimed for any of their chips, or even any claim that they’ve even made an actual chip. They don’t seem to claim that on their own website, unless I am missing something.

And, yes, they were designers at Apple. On a team with a hundred CPU designers. So what? Lots of companies have former Apple chip designers. And many of those Apple chip designers came from AMD, P.A. Semi, DEC, Intrinsity, etc. And those folks have also gone all over the industry. Does that mean everyone is as good as Apple?
 
What comes after 1nm?

picometers
then
femtometers
next
attometers
and even smaller
zeptometers
yet even smaller
yoctometers



 
This really blows my mind. Early in my career with the DoD (1980s) I worked with many of the US chip makers that supplied DoD and NASA (National Semi, TI, Raytheon, Hughes, TRW) - most of which don't exist (or at least don't make chips) any more. Features were measured in microns, not nanometers. Packaging and materials were a large consideration because the circuits were getting so small that cosmic rays could flip RAM bits. A lot of error correction was built in to try to correct for flipped bits. My career pivoted to IT when the US chip business dried up and I haven't kept up with the industry, but it's amazing that circuits this small work at all given the energies of cosmic ray particles - makes me wonder how they're protecting chips nowadays.
 
This really blows my mind. Early in my career with the DoD (1980s) I worked with many of the US chip makers that supplied DoD and NASA (National Semi, TI, Raytheon, Hughes, TRW) - most of which don't exist (or at least don't make chips) any more. Features were measured in microns, not nanometers. Packaging and materials were a large consideration because the circuits were getting so small that cosmic rays could flip RAM bits. A lot of error correction was built in to try to correct for flipped bits. My career pivoted to IT when the US chip business dried up and I haven't kept up with the industry, but it's amazing that circuits this small work at all given the energies of cosmic ray particles - makes me wonder how they're protecting chips nowadays.

Mostly just ECC. If you don’t have ECC RAM, you take your chances. The memory on-chip is generally static, not dynamic, so it’s not as likely to be flipped by a stray cosmic ray, but oftentimes there is error correction built in anyway (for caches, not for any register files that I’m aware of).
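
To make the ECC idea concrete, here’s a toy Hamming(7,4) sketch in C++ - purely illustrative, not how any particular DRAM controller or cache actually does it (real memory ECC uses wider SECDED codes, e.g. 72 bits protecting 64, so a double flip is at least detected):

    #include <cstdint>
    #include <cstdio>

    // Encode a 4-bit value into a 7-bit codeword. Positions (1-indexed) are
    // p1 p2 d1 p3 d2 d3 d4; each parity bit covers the positions whose index
    // has the corresponding bit set.
    static uint8_t encode(uint8_t data) {
        uint8_t d1 = (data >> 0) & 1, d2 = (data >> 1) & 1,
                d3 = (data >> 2) & 1, d4 = (data >> 3) & 1;
        uint8_t p1 = d1 ^ d2 ^ d4;   // covers positions 1,3,5,7
        uint8_t p2 = d1 ^ d3 ^ d4;   // covers positions 2,3,6,7
        uint8_t p3 = d2 ^ d3 ^ d4;   // covers positions 4,5,6,7
        return (p1 << 0) | (p2 << 1) | (d1 << 2) | (p3 << 3) |
               (d2 << 4) | (d3 << 5) | (d4 << 6);
    }

    // Decode: recompute the parity checks; the resulting "syndrome" is the
    // 1-indexed position of a single flipped bit (0 means no error).
    static uint8_t decode(uint8_t cw) {
        auto bit = [&](int pos) -> uint8_t { return (cw >> (pos - 1)) & 1; };
        int syndrome = (bit(1) ^ bit(3) ^ bit(5) ^ bit(7))
                     | ((bit(2) ^ bit(3) ^ bit(6) ^ bit(7)) << 1)
                     | ((bit(4) ^ bit(5) ^ bit(6) ^ bit(7)) << 2);
        if (syndrome) cw ^= 1u << (syndrome - 1);  // flip the bad bit back
        return bit(3) | (bit(5) << 1) | (bit(6) << 2) | (bit(7) << 3);
    }

    int main() {
        // Simulate a single "cosmic ray" flip in every bit of every codeword
        // and confirm the original nibble is always recovered.
        for (uint8_t v = 0; v < 16; ++v)
            for (int flip = 0; flip < 7; ++flip)
                if (decode(encode(v) ^ (1u << flip)) != v) {
                    std::printf("miscorrected value %u\n", v);
                    return 1;
                }
        std::printf("all single-bit flips corrected\n");
        return 0;
    }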
 
Mostly just ECC. If you don’t have ECC RAM, you take your chances. The memory on-chip is generally static, not dynamic, so it’s not as likely to be flipped by a stray cosmic ray, but oftentimes there is error correction built in anyway (for caches, not for any register files that I’m aware of).
That's what blows my mind (no ECC for registers). What used to be the size of a single bit is now hundreds (thousands?) of transistors susceptible to getting hit all at once.
 
That's what blows my mind (no ECC for registers). What used to be the size of a single bit is now hundreds (thousands?) of transistors susceptible to getting hit all at once.

The register bits are static RAM, so they are more difficult to flip. They are also typically wired in differential fashion - so if a ray hits a wire, it hits the negative polarity version of that wire as well. But since it’s the difference in voltage between the wires that matters, and since both wires have their voltage modified in the same direction, it doesn’t affect how the value is interpreted.
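
If it helps, here’s a toy numeric illustration of that point (just the arithmetic, not a circuit model of any real cell): the read value is the sign of the difference between the two wires, so a disturbance that shifts both wires by the same amount cancels out.

    #include <cstdio>

    int main() {
        double bit = 1.0, bit_bar = 0.0;   // pair of wires storing a logical '1'
        double disturbance = 0.3;          // hypothetical hit nudging both wires equally

        double diff_before = bit - bit_bar;                                 // +1.0 -> reads as 1
        double diff_after  = (bit - disturbance) - (bit_bar - disturbance); // still +1.0

        std::printf("difference before: %+.1f, after: %+.1f (read value unchanged)\n",
                    diff_before, diff_after);
        return 0;
    }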
 
Already mentioned in another thread, but hey, again:


Not gonna defend Intel for “slouching” around vs the competition, but could EVERYONE here going “duh, Intel is so behind - they are still on 14nm++++++ - they must be the dum.... company on the planet” please back off.

7 vs 10 vs 14 nm has NOTHING to do with transistor size (anymore). It is simply a remnant of earlier times, and the “nm” term is only used to name generational jumps/improvements. Whether it is a declining number (14-10-7-5-3...), or gen 1, 2, 3, 4, 5, or just simply adding (those friggin’) “++++” - it simply doesn’t matter anymore.

For clarification: Der 8auer actually put Intel and AMD under a microscope - and guess what: size and evolution DO not really differ - so please bash Intel otherwise/correctly.

Video part 1:
Video part 2:
Video part 3:

Enjoy learning about “14” vs “7” nm
Kind of missing the point. No one really disputes that Intel 14 nm is about equivalent to TSMC 10 nm. The problem for Intel is that they still haven't finished their 10 nm rollout and TSMC is now talking about 3 nm late next year. The relative sizes of the features on the chip is what matters, not the marketing designation.

For reference, transistor density in million transistors per square millimeter (MTr/mm²):

TSMC 10 nm = 52.5 MTr/mm²
TSMC 7 nm = 96.5 MTr/mm²
TSMC 7+ nm = 113.9 MTr/mm²
TSMC 5 nm = 171.3 MTr/mm²

Intel 14 nm = 37.2 MTr/mm²
Intel 10 nm = 100.8 MTr/mm²
Intel 7 nm hasn't been published as far as I can tell though Intel has previously claimed a 2x improvement. Take that as you will.

You can see from the transistor density that Intel's 10 nm is about equivalent to TSMC's 7nm range but it can't touch TSMC's 5 nm much less their 3 nm. Intel's 14 nm is not even remotely competitive on feature size. No matter how you slice it up, Intel is very far behind. Many years behind.
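
To put rough numbers on those gaps, here’s a quick back-of-the-envelope using the densities above (Intel 7 nm left out since nothing official has been published):

    #include <cstdio>

    int main() {
        // MTr/mm^2 figures quoted above
        const double tsmc7 = 96.5, tsmc5 = 171.3;
        const double intel14 = 37.2, intel10 = 100.8;

        std::printf("Intel 10 nm vs TSMC 7 nm:  %.2fx\n", intel10 / tsmc7);  // ~1.04x, roughly a wash
        std::printf("TSMC 5 nm vs Intel 10 nm:  %.2fx\n", tsmc5 / intel10);  // ~1.70x denser
        std::printf("TSMC 5 nm vs Intel 14 nm:  %.2fx\n", tsmc5 / intel14);  // ~4.6x denser
        return 0;
    }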
 
That chef didn’t do anything! She just bought a fancy oven!
That photographer wasn’t innovative - the credit goes to Nikon!
Straying off topic, the second statement (or an equivalent) is actually quite common, and it's something that frequently annoys photographers. If a chef prepares an amazing meal, nobody says "You must have a really nice oven." They compliment the chef. But if a photographer shows an amazing photograph, quite often the first comment is "Wow, you must have a really good camera."

Back on topic, your posts in this thread have been very interesting and informative, so thanks!
 
As it stands right now, it seems with existing technology and known physics, it’s doubtful we will get past 3nm without facing major problems with quantum tunneling.

Here‘s an article from 2012 pointing out that for years PRIOR to that, people had been predicting the end of Moore’s law due to quantum tunneling: https://computer.howstuffworks.com/moores-law-outdated.htm

Here’s one from 2010: https://open.lib.umn.edu/informationsystems/chapter/5-2-the-death-of-moores-law/

In 1996, my Ph.D. advisor told me we would be doomed by 2000, which is why we should switch to HBT transistors right away.


Just like every year is going to be the year of Linux on the desktop, every year is going to be the end of Moore’s Law due to quantum tunneling.
 
Where is that information? I haven’t seen any numbers claimed for any of their chips, or even any claim that they’ve even made an actual chip. They don’t seem to claim that on their own website, unless I am missing something.

And, yes, they were designers at Apple. On a team with a hundred CPU designers. So what? Lots of companies have former Apple chip designers. And many of those Apple chip designers came from AMD, P.A. Semi, DEC, Intrinsity, etc. And those folks have also gone all over the industry. Does that mean everyone is as good as Apple?
Nuvia was founded in part by ex-Apple CPU design chief Gerard Williams. He wanted Apple to try its hand at servers, and when it seemed Apple was not interested, he left Apple and formed Nuvia.
Nuvia produced a spec sheet last year that looked very similar to Apple’s M1 in terms of speed and power efficiency.
There is no reason to doubt their claims.

Apple has filed a lawsuit against Gerard Williams with claims that the chip designer founded his new company, Nuvia, while still working for Apple. It accuses Williams of going against the anti-competitive clauses in his contract before leaving the company, and also of then recruiting his former colleagues.
 
Nuvia was founded in part by ex-Apple CPU design chief Gerard Williams. He wanted Apple to try its hand at servers, and when it seemed Apple was not interested, he left Apple and formed Nuvia.
Nuvia produced a spec sheet last year that looked very similar to Apple’s M1 in terms of speed and power efficiency.

Apple has filed a lawsuit against Gerard Williams with claims that the chip designer founded his new company, Nuvia, while still working for Apple. It accuses Williams of going against the anti-competitive clauses in his contract before leaving the company, and also of then recruiting his former colleagues.

That is completely non-responsive to my question. I didn’t ask about any of that - I’ve stated repeatedly that people who used to work at Apple went to Nuvia, but also pointed out that these were just a couple out of the 100 people who worked on Apple processors.

My question, which you didn’t answer, was about your claim of published performance numbers for nuvia’s chips. Where are those numbers?
 
That is completely non-responsive to my question. I didn’t ask about any of that - I’ve stated repeatedly that people who used to work at Apple went to Nuvia, but also pointed out that these were just a couple out of the 100 people who worked on Apple processors.

My question, which you didn’t answer, was about your claim of published performance numbers for nuvia’s chips. Where are those numbers?
It was Apple’s CPU design chief.
The specs are very similar to Apple’s M1. Google it.

Here, AnandTech did a write-up on it.
 
google it.
I did. I found nothing. I also found no claim that they even ever made a functioning chip. I’ve seen only press releases about their “goals.”

In any event, you are the one who made the claim that they had achieved something. Also, per forum rules, that’s not a valid answer:

Instructing other members to search. Instructing members to search themselves for an answer or responses such as ****** ("**************************") are experienced as rude and condescending. We don't have an issue with people linking to Google search results, although we prefer that members also link to a specific page that addresses the question being posed. A few words explaining how you got your search results makes your response even more helpful.
 
I did. I found nothing. I also found no claim that they even ever made a functioning chip. I’ve seen only press releases about their “goals.”

In any event, you are the one who made the claim that they had achieved something. Also, per forum rules, that’s not a valid answer:

Instructing other members to search. Instructing members to search themselves for an answer or responses such as ****** ("**************************") are experienced as rude and condescending. We don't have an issue with people linking to Google search results, although we prefer that members also link to a specific page that addresses the question being posed. A few words explaining how you got your search results makes your response even more helpful.
 
Sure, that’s all that’s out there. They “target” a certain level of performance. But they never made a functional chip and never claimed to have actually achieved that performance. Which is what I said all along.

If you read the article, they are actually “predicting” performance by using OTHER Arm chips and running benchmarks on them. There is absolutely no information in this article that suggests that they had done any testing of anything they produced themselves, which is actually quite telling; in the CPU industry we can get a pretty decent estimate of a benchmark by simulation, assuming we had actually produced even a Verilog model of our design.
 
It was Apple’s CPU design chief.
The specs are very similar to Apple’s M1. Google it.

Here, AnandTech did a write-up on it.
Since you edited your post after I responded, no, it was not Apple’s CPU design chief. It was their chief architect. That’s very different. https://www.anandtech.com/show/15115/nuvia-breaks-cover-new-startup-to-take-on-datacenter-cpu-market

Johny Srouji is Apple’s CPU design chief.

A cpu architect doesn’t do design (unless they happen to also be a designer, e.g. at a startup, which Apple certainly is not).
 
I did. I found nothing.
Ah, the problem is you used Google. :cool:

Me: "Hey Siri, find me proof that backs up something that someone on MacRumors is claiming to be the case."
Siri: "Please be more specific."
Me: "Hey Siri, find me specs for chips that Nuvia has created"
Siri: "OK. Nestle's chocolate chips, per serving size of 14 grams, have 70 calories, 9 grams total carbohydrates, 3.9 grams total fat"
Me: "Hey Siri, stop!"

Actually, I do like Siri for what I use it for.
 
What do they do then? Genuinely asking.
They do architecture! :)

It varies depending on where you are, but typically that would be high-level stuff (we should have X cores, each core should have Y ALUs, the branch prediction algorithm should be this, there should be a core for machine learning, and it should perform these functions, etc.). Pretty typically they may also write a model (in C++, Verilog, or some other language) that allows simulation of the functionality, and which can be used to verify correct operation after the designers are done.

The designers take the model, and do the design. So, for example, if I am the designer for the integer ALUs, I may be given code that shows what functions it should perform and what the results should be. Something like:

if instruction == .add {
    Answer = A + B
}

I need to convert that into a hardware design. So I figure out what the adder will look like, where the transistors will go, how the wires will be done, etc. I may do a carry-lookahead adder, a Ling adder, or some other design. I will keep iterating on it to make it run within the required power budget and meet the required timing constraints.

I may look at the big picture and realize that it would be more efficient if some things my block is supposed to do were instead done somewhere else (e.g. the register file), in which case I go back to the architect and work with her to repartition the logic. I figure out where the pins on the edge of my block go, so they can tile to neighboring blocks. I figure out where the clock wires go, signal repeaters go, power and ground rails go, etc.

Then I produce my own Verilog (or whatever) that can be simulated, but it is written at the gate level and explicitly lists all the connections between gates. If run as a simulation, it operates a thousand times slower than the architectural model, because of all the details. So I run formal verification, which mathematically compares my design to the original model. Etc. etc.
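
As a toy illustration of that split between the architect’s model and the designer’s structure (hypothetical, nothing like a real RTL/formal flow), here’s a behavioral add next to a gate-level-style ripple-carry adder, with a brute-force equivalence check standing in for formal verification:

    #include <cstdint>
    #include <cstdio>

    // Architect's view: just the function the block must compute.
    uint8_t add_model(uint8_t a, uint8_t b) { return static_cast<uint8_t>(a + b); }

    // Designer's view: an 8-bit ripple-carry adder described bit by bit.
    // (A real design might use a carry-lookahead or Ling adder for speed.)
    uint8_t add_ripple(uint8_t a, uint8_t b) {
        uint8_t sum = 0, carry = 0;
        for (int i = 0; i < 8; ++i) {
            uint8_t ai = (a >> i) & 1, bi = (b >> i) & 1;
            uint8_t s  = ai ^ bi ^ carry;                  // full-adder sum
            carry      = (ai & bi) | (carry & (ai ^ bi));  // full-adder carry-out
            sum       |= static_cast<uint8_t>(s << i);
        }
        return sum;  // final carry-out dropped, like a wrapping 8-bit add
    }

    int main() {
        // "Formal verification" stand-in: exhaustively compare design to model.
        for (int a = 0; a < 256; ++a)
            for (int b = 0; b < 256; ++b)
                if (add_ripple(static_cast<uint8_t>(a), static_cast<uint8_t>(b)) !=
                    add_model(static_cast<uint8_t>(a), static_cast<uint8_t>(b))) {
                    std::printf("mismatch at %d + %d\n", a, b);
                    return 1;
                }
        std::printf("ripple-carry design matches the behavioral model\n");
        return 0;
    }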

Most of my career I was a designer. I worked on chip floor plans, circuit designs, logic designs, physical designs (where do the transistors and wires go), etc. Sometimes I did some instruction set architecture work (the original design of the x86-64 64-bit integer instructions, for example). I rarely did architecture, other than at the beginning of Opteron/Athlon 64, where we all had to pitch in because we didn’t really have an architect.
 
You are out of your mind if you think no “competition” is good for anyone other than Apple. I hate to break it to you, Apple doesn’t care about you.
You are out of your mind if you think no “competition” is good for anyone other than Apple. I hate to break it to you, Tim Cook doesn’t care about you.

Fixed the typo.
 