Intel Says 10nm Chip Development is On Track

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Oct 22, 2018.

  1. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #176
    This post makes no sense. The physics problems that arise at smaller feature sizes (leakage, DFM issues, electromigration, IR drop, etc.) affect all instruction set architectures equally. The electrons and holes don’t care if you are RISC or CISC. Maxwell’s equations still apply.

    Having designed PowerPC, SPARC, MIPS, and AMD64 CPUs, I’ve not once found that one architecture works at a process node but another doesn’t. The only difference is yield. Some architectures require more transistors than others.
     
  2. StralyanPithecus, Oct 23, 2018
    Last edited: Oct 23, 2018

    StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #177

    If you have helped design CPUs, then you know that what makes the logic work is how the connections between transistors, registers, ALUs, and control units are built. The more complex the connections, the harder they are to shrink, and the more crosstalk problems you have, even between the metal layers, not only in the silicon substrate. I already explained this in a previous post.

    Another point, which I've already mentioned, is the old baggage the x86 architecture carries with it, to the detriment of efficiency, something that ARM doesn't carry. x86 CPUs need to be backwards compatible with software and OSes from the '80s or even before that. And of course there is VT-x (VM emulation) included in x86-64 CPUs that ARM doesn't have.

    Let me finish by adding that I'm not an Intel zealot. I know Intel chips are not the most efficient here; they are the big truck, but their market customers ask them to be backwards compatible and a bit of a jack of all trades. I would love to see a Mac running an in-house Apple CPU; it would help most apps run better. As for more complex software, that's a different issue, and it would need to be rebuilt almost from the ground up to make full use of the ARM advantages.
     
  3. im_to_hyper macrumors 65816

    im_to_hyper

    Joined:
    Aug 25, 2004
    Location:
    Glendale, California, USA
    #178
    How do you overcome the extremely cold temperatures needed for effective quantum computing?

    A liquid nitrogen or liquid helium setup at home just seems... dangerous and inefficient.

     
  4. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #179
    Quantum computing is still in its infancy; right now it has no use at home or even in business. It's the long-term future, but in the meantime we need to explore other technologies like carbon nanotubes, and we are already using the parallel processing / multicore approach.
     
  5. uberzephyr macrumors member

    uberzephyr

    Joined:
    Feb 18, 2003
    #180
    No one said anything about better, the benchmark in question was complexity.
     
  6. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #181
    I designed many CPUs. Exponential x704, AMD K6+, Opteron, Athlon 64, UltraSparc V, etc. I don't know what you are trying to say. Are you talking about interconnect coupling? If so, what does that have to do with anything? There is just as much crosstalk in a PowerPC as in an x86. The interconnect graph is equally complicated, and Cadence routes them using the same algorithms. In fact, the reason the first x704 tape-out didn't operate at full speed was due to a coupling issue. And coupling is taken care of by adjusting interconnect pitch, wire swizzling, and adding ground/power planes. The same techniques are used regardless of instruction set architecture.

    If you are saying it's somehow more difficult to shrink an x86 than an ARM chip, that's nonsense. Internally, they are pretty much the same. The main difference is that an x86 has a much more complicated instruction decoder. But the execution units, caches, floating point units, scheduler, register renaming, TLBs, etc. all look almost identical. I helped shrink many x86 chips. I never once thought "gee, this would be so much easier to shrink if this were a RISC chip." It's the same.

    Ok. Not relevant to anything that was said, but ok. Note that x86-64 gets rid of a lot of that cruft, so that compatibility is achieved by microcode, instead, and there isn't a lot of random hardware in the design to support the old instruction sets. As I said, the instruction decoder (including the microcode ROMs) is the main difference. It adds about 20% to the area of a single core die. Of course the penalty is less when you factor in cache, I/Os, etc.
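
    To put rough numbers on that, here is a back-of-the-envelope sketch; only the ~20% decoder figure comes from the paragraph above, and every other number (core size, core count, uncore area) is made up purely for illustration:

        # Back-of-the-envelope die-area math (illustrative, hypothetical numbers).
        core_area        = 10.0   # mm^2 per core (hypothetical)
        decoder_fraction = 0.20   # ~20% of a core spent on x86 decode + microcode ROMs
        cores            = 4
        uncore_area      = 60.0   # mm^2 of cache, I/O, memory controller, etc. (hypothetical)

        decoder_area = cores * core_area * decoder_fraction   # 8 mm^2
        total_die    = cores * core_area + uncore_area        # 100 mm^2
        print(f"decoder overhead: {decoder_area / total_die:.0%} of the whole die")  # ~8%

    With those made-up proportions, the 20% per-core penalty dilutes to something like 8% of the full die, which is the point about cache and I/Os reducing the cost.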
     
  7. ArtOfWarfare macrumors G3

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #182
    Directly, no. But I think as more developers find they're using ARM CPUs on the desktop, they'll increasingly include ARM build targets for *nix.

    Already on my Pi, I'm finding it's rare that I can't find what I want built for ARM.
     
  8. jdiamond macrumors 6502

    Joined:
    Dec 17, 2008
    #183
    Who cares about the etch size? When will we finally see all the Skylake features we were promised, like AVX-512, or modern video standards?
     
  9. StralyanPithecus, Oct 23, 2018
    Last edited: Oct 23, 2018

    StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #184
    I'm not a designer, so you have the upper hand here; I just used to be part of the manufacturing process (the diffusion area and quality control for the whole process before sort). But I used to work very closely with the final design teams. At Intel a single person doesn't design a CPU; it takes several teams with hundreds of people across the globe and a good few years of work. My job was to be part of the team that builds the physical CPU design and solves all the problems in the process, so a deep knowledge of the CPU's structure is a must.

    By the way, there's knowledge that is impossible to learn outside the fab, at university or anywhere else, because the information is considered commercially protected and under NDAs. Intel used to have the three "shock months" for new engineers coming in to work. We used to say, "Forget everything you learned; here is a new world."
     
  10. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #185
    We never hired folks from Intel because they worked on teams of hundreds and therefore had no understanding of the overall design. We'd ask them questions at interviews and all they knew about was how to design the adder circuit they'd been responsible for over the previous 10 years. At AMD, for example, there were about 15-20 main people who designed x86-64 and the first chip that used it, plus some folks who did things like design verification. The physical designers did their own logic design, floor planning, place and route, etc. And, for example, rather than owning, say, an adder, I would own logic and physical design for all the integer and floating point execution units, or the integer units and the scheduler, etc. Same thing at Sun. At Exponential, when I arrived, I took over responsibility for half the chip (to be fair, it was a shrink-type situation by that point). We worked closely with the fab people, but we drove what they were doing. And the drivers for process technology didn't change one iota depending on whether we were designing an x86 or a RISC.
     
  11. GeneralChang macrumors 65816

    Joined:
    Dec 2, 2013
    #186
    Both of these are factual statements! As the proud owner of a PC from 2007 (the GPU is actually from 2012, because driver support) that's running Windows 10 and a PC from 1995 (not quite 89, unfortunately, that uh... would be almost as old as me) that's running a fairly recent low-power distro of Linux, I can confirm both statements.

    I can also confirm that neither of those user experiences is very good. In fact, I'd go so far as to say they're both an unusable pile of crap with those OSes on them.

    If your hardware is more than 7 or 8 years older than your OS, particularly if you haven't made upgrades to stuff like your storage speed or RAM (I can't even tell you how much I hate Win10 without an SSD), I feel comfortable saying you are going to have a problem, regardless of whether the installation actually completes. So, actually, I feel like Apple has the right idea in this regard.
     
  12. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #187
    Just a small note... You know that a lot of expats from Intel ended up working at AMD in the '90s during the Pentium crisis?
     
  13. Unregistered 4U macrumors 6502

    Joined:
    Jul 22, 2002
    #188
    I think this is one of the more important points that should be remembered/considered. A good portion of an Intel CPU die is devoted to backwards compatibility and to converting every possible variety of instruction into micro-ops. Consider what performance we'd be seeing from Intel if they had been able to jettison the cruft.
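
    Purely as an illustration of that conversion (the mapping and micro-op names below are invented for the example, not Intel's actual decode tables), a complex memory-operand instruction gets cracked into simpler RISC-like operations roughly like this:

        # Toy model of CISC-to-micro-op cracking (illustrative only; real decoders are
        # hardware tables plus microcode, and these micro-op names are made up).
        def decode(instruction: str) -> list[str]:
            if instruction == "add [rbx], rax":        # read-modify-write memory add
                return [
                    "load  tmp0 <- mem[rbx]",          # split into a load...
                    "add   tmp0 <- tmp0 + rax",        # ...a register-to-register add...
                    "store mem[rbx] <- tmp0",          # ...and a store
                ]
            if instruction == "add rcx, rax":          # simple register-to-register add
                return ["add rcx <- rcx + rax"]        # already maps to one micro-op
            raise NotImplementedError(instruction)

        for insn in ("add [rbx], rax", "add rcx, rax"):
            print(insn, "->", decode(insn))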

    Question: Does Intel ship any mass produced processor that ONLY supports 64-bit instructions? If I remember correctly, Apple was able to free up a chunk of processor real estate by removing 32-bit compatibility.
     
  14. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #189
    I worked at AMD (microprocessor design) from 1997 onward. Are you referring to the Austin team? If so, I'll note that that team failed utterly, and AMD bought NexGen to actually have a decent K6. (I went to work for the NexGen group before it was integrated into AMD.) The NexGen group (with some DEC Alpha folks) also did the x86-64 work. I'm not aware of any Intel people in the California microprocessor design division in the decade I was there.
    --- Post Merged, Oct 23, 2018 ---
    It takes 20% of the core die area to support the complex instruction decoder. On a multi-core die, taking into account cache area, I/Os, etc., the penalty is much smaller.

    Still, would be better to not need it at all.
     
  15. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #190
    Not commercially, but in my day we had some projects working on a completely different architecture, closer to ARM... I left a long time ago, so I really don't know, or I can't say without compromising people...
     
  16. FriendlyMackle, Oct 23, 2018
    Last edited: Oct 23, 2018

    FriendlyMackle macrumors 6502

    FriendlyMackle

    Joined:
    Oct 29, 2011
    Location:
    NYC
    #191
    Apple's total chip orders may be "small," as you say, but Apple has a sterling reputation and leads the industry in terms of design and desirability. In other words, it is a prestigious brand, and losing Apple will not be good for Intel. It will be a blow to Intel's brand to lose Apple's non-iPad computer products if the ARM chips Apple uses get favorable reviews. That's if the rumors are true about Apple planning to ditch Intel for its own proprietary designs. I think this will happen, but it will be a few years down the road. In the meantime... Apple can let Intel worry about what their plans may be.

    I'm not an engineer, so I can't comment on the veracity of your statement about the x64 architecture and sub-12nm chips. I have read that Intel's and AMD's chips are much more densely populated with transistors than ARM chips (by something like 50%). Is that true?
    But I have also read that a good amount of what is in the x64 instruction set is ancient and should be scrapped... leading to unnecessarily complex instruction sets and designs. Again, I don't know if this is accurate.

    I only know that as a consumer I would like to see more competition in this space!

    **EDITED TO ADD: Also, I think another way in which Intel would suffer from Apple using its own processors is the PC industry following suit (if/when they see market acceptance of Apple's products). After all, ARM chips cost a lot less than Intel's. There's a huge market of low-to-mid-level PCs which could ostensibly use ARM chips and offer consumers lower prices and/or better profits for the PC makers.
     
  17. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #192
    Not all ARM chips. There have been A-series chips with comparatively low transistor density, but that's presumably because they are using the EVSX design style with 1-of-n logic. (This is a different way of designing circuits that has certain speed advantages in certain situations.) Not clear if they are still doing that. But most ARM chips have similar active-area density to most x86 chips.
     
  18. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #193
    I will not follow you into the Intel-AMD wars; I left Intel a long time ago ;) ... Just good luck with the AMD-Intel partnership.
     
  19. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #194
    I'm out of there. I retired. Now I'm a lawyer.
     
  20. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #195
    I'm working as an IT manager, sysadmin, and everything-computer-related guy at a small company, but dedicating most of my time to hunting subatomic particles and trying to make sense of the weirdness of the quantum world...
     
  21. FriendlyMackle macrumors 6502

    FriendlyMackle

    Joined:
    Oct 29, 2011
    Location:
    NYC
    #196
    What?!: "the sensibility of the quantum bits to external influences (a simple fart can kill the system)"
    LOL...please tell me you substituted 'fart' for something more technical?
     
  22. StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #197
    Ha ha, sorry for that! For you, something more technical: even cosmic radiation or microwaves (the kind used to heat your food) can break the entanglement between particles, and that's a humongous problem right now. A quantum CPU is built using pairs of particles in an entangled configuration...
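
    For a rough sense of the energy scales involved (the 5 GHz qubit frequency below is just a typical ballpark, assumed for illustration), the qubit energy corresponds to only a fraction of a kelvin, which is why the chips sit in dilution refrigerators at millikelvin temperatures where stray thermal or microwave photons are rare:

        # Why millikelvin temperatures: a rough energy-scale estimate
        # (the 5 GHz qubit frequency is an assumed, typical ballpark figure).
        h  = 6.626e-34   # Planck constant, J*s
        kB = 1.381e-23   # Boltzmann constant, J/K

        f_qubit = 5e9                  # Hz, typical superconducting qubit frequency
        T_equiv = h * f_qubit / kB     # temperature whose thermal energy matches the qubit energy
        print(f"h*f/kB = {T_equiv * 1e3:.0f} mK")   # ~240 mK

        # To keep thermal photons from randomly flipping the qubit and destroying its
        # quantum state, you want T << 240 mK, hence fridges running at roughly 10-20 mK.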
     
  23. FriendlyMackle macrumors 6502

    FriendlyMackle

    Joined:
    Oct 29, 2011
    Location:
    NYC
    #198
    All I can say is 'whew!' Yes, I understand (vaguely) that quantum physics is completely unlike the classical kind which we seemingly experience at our level of life (the non-subatomic kind). And, not entirely joking here: what about thoughts... will observation affect quantum CPU functionality? I.e., how far away are we from practical application in consumer-facing devices?
    --- Post Merged, Oct 23, 2018 ---
    Another question for you, about virtualization. Since ARM chips don't have this (right?), would this be a disadvantage for most consumers using laptops and desktops with general non-scientific software? And why can't ARM and/or Apple design virtualization methods without infringing on the Intel patents? Isn't this done all the time in other areas (including phone and computer manufacturing)? There is usually more than one way to achieve the same end.
     
  24. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #199
    ARM supports virtualization.
     
  25. StralyanPithecus, Oct 23, 2018
    Last edited: Oct 23, 2018

    StralyanPithecus macrumors member

    StralyanPithecus

    Joined:
    Sep 27, 2018
    #200
    First, sorry if I implied that I work on quantum computers; I don't. I just follow the latest developments. I'm more into the theoretical part of quantum physics.

    Here's a small and simplified explanation of how quantum computers work:

    Let's assume you are looking for a friend in a nearby city, but you don't know the address, so you have to start asking house by house whether your friend lives there. The classic way computers work is to ask whether John Doe lives at the first house you visit, and if the answer is no, you move to the next house and ask again. On a classical PC you can run several parallel searches, but you are constrained by the PC's capability, whether that's 100 parallel searches or 1000. On a quantum computer you ask all the houses in the city at the same time and get the answer in essentially one operation. But for that you need a mathematical algorithm to read the quantum computer's answer, because in quantum physics measuring something changes the result, or quantum state. For that you need a classical computer.
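
    As a rough sketch of the classical side of that analogy (the names and numbers are made up, and in practice a real quantum search such as Grover's algorithm needs on the order of the square root of N queries rather than literally one):

        import math
        import random

        # Classical search: in the worst case you knock on every one of N doors.
        def classical_search(houses, friend):
            for queries, resident in enumerate(houses, start=1):
                if resident == friend:
                    return queries               # number of doors knocked on
            return None

        N = 10_000
        houses = [f"resident_{i}" for i in range(N)]
        houses[random.randrange(N)] = "John Doe"     # hide the friend at a random house

        print("classical queries:", classical_search(houses, "John Doe"))           # up to N
        print("Grover-style queries, roughly:", round(math.pi / 4 * math.sqrt(N)))  # ~ (pi/4)*sqrt(N), about 79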

    So in brief, so far it's not very useful for home users. And for now you will always need a classical computer working together with the quantum computer.

    As for virtualization, cmaier already answered you, and surely he knows more than I do in that area.
     
