Maybe this is just my own brain, but it's interesting to me how the potential for special-purpose coprocessors/cores has never decreased, but seems to be much less visible than it once was.

I remember way back in the 68k days, for "normal use" benchmarks--which at the time were largely integer-based--an '030 and an '040 might rate similarly, but for floating-point operations the FPU in the '040 made it dramatically faster. Without an FPU, decoding an MP3 took forever; with one, it could be played in real time.

Later, vector processing units made certain things vastly faster on chips that had them versus chips that didn't.
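
(A present-day echo of the same idea, as a minimal Swift sketch - assuming Apple's Accelerate framework, with arbitrary array sizes chosen purely for illustration: the scalar loop does one multiply per iteration, while the vDSP call hands the whole thing to the SIMD hardware.)

```swift
import Accelerate

// Two large input vectors (sizes are arbitrary, purely illustrative).
let n = 1_000_000
let a = (0..<n).map { Float($0) }
let b = (0..<n).map { Float($0) * 0.5 }

// Scalar version: one multiply per loop iteration.
var scalarResult = [Float](repeating: 0, count: n)
for i in 0..<n {
    scalarResult[i] = a[i] * b[i]
}

// Vectorized version: Accelerate's vDSP runs the same element-wise
// multiply through the vector unit in wide chunks.
let vectorResult = vDSP.multiply(a, b)

// Both produce the same single-precision products.
print(scalarResult[42], vectorResult[42])
```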

These days we still benefit immensely from things like hardware video codecs, but they're common enough and invisible enough that we just don't think much about them anymore. Of course your phone has an HEVC encoder; almost everything does now. It's just sort of there, doing its thing invisibly.
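
(And since the post mentions HEVC: here's a tiny Swift sketch - assuming macOS/iOS with VideoToolbox - that asks whether the hardware HEVC decoder is present. It's the simplest one-line probe of that invisible silicon; encoder support is enumerated through a separate VideoToolbox API.)

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether this machine/device has a hardware HEVC decoder.
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("Hardware HEVC decode is available")
} else {
    print("HEVC would fall back to software decode here")
}
```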

Which is why the incredibly low friction between Apple's chip team and their software group could make AS Macs more interesting than just Geekbench scores. We're already (I assume) seeing the benefits of this in some of the flashy tricks an iPhone camera can pull off thanks to the specialized processing in A-series chips. So it wouldn't be a stretch at all to think that a group at Apple developing macOS or Mac software might say, "If we had a core that could do [thing] 100x faster than a general-purpose core can, we could implement this awesome new feature," and have that result in an actual core that does [thing] within a couple of chip generations.

It's mostly hypothetical at this point, but I have to keep reminding myself that even though Apple Silicon turns in truly impressive Geekbench scores that doesn't actually capture a lot of the real potential of a completely vertical development stack like Apple is moving toward.
 
On Wikipedia it says that silicon has:

- covalent radius = 111 pm ~ 0.1 nm
- van der Waals radius = 210 pm ~ 0.2 nm

The distance between two neighboring atoms must be twice that (since diameter = 2 * radius).

Isn't "5nm" the MOSFET channel length? How many atoms does it need at minimum?
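
(Back-of-the-envelope, using the covalent radius quoted above and treating neighboring atoms as spaced one covalent diameter apart - a rough picture, not a real lattice model:)

```latex
% Rough count of silicon atoms across a 5 nm span,
% assuming a spacing of one covalent diameter (2 x 111 pm):
\frac{5\ \text{nm}}{2 \times 0.111\ \text{nm}} \approx 22.5
\quad\Rightarrow\quad \text{a couple of dozen atoms across}
```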

The issue is not how many atoms it must have, but whether or not it can be turned off so as to prevent current flow through the channel. (We aren't talking about planar MOSFETs anymore - those haven't been used in a while.)

That's why we now use FinFETs, which use 3D structures that help shut the channel off more reliably, and why we are moving to new 3D structures that improve things still further (e.g. the already-announced Samsung MBCFET).

As long as Schrödinger's equation tells us that the quantum well is big enough to keep the electron from tunneling through the box formed by the gate, we're fine. And that's a function not just of the size, but of the electric field.
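
(For the curious, the textbook picture behind that: for a simple rectangular barrier, the tunneling transmission falls off exponentially with the barrier width and with the square root of its effective height - a schematic estimate, not a device model:)

```latex
% Tunneling through a rectangular barrier of height V_0 > E and width L:
T \approx e^{-2\kappa L},
\qquad
\kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}
```

Shrink L and leakage climbs exponentially, unless you claw some of it back by raising the effective barrier - which is what wrapping the gate around the channel (FinFET, gate-all-around) helps with.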
 
You would be surprised how many developers use Macs with "non-traditional" VM tools such as Docker, Valet, Homestead and other container setups. Software like MAMP and WAMP is a thing of the past; there are so many advantages to containerization-based development.

This is going to be a huge struggle for developers if they are unable to efficiently run Linux micro-OSes without passing everything through Rosetta 2. That will increase hardware costs and decrease developer productivity.

For example, with the latest version of Magento 2.4+ you cannot even host the software on a Mac or Windows desktop... it REQUIRES a VM or dedicated box instead.

Almost none of that is x86-specific. You can just run an ARM container instead.
 
Calling Apple's chip developers "amateurs" proves that you are _not_ someone to be taken seriously.

And if you call playing video games "serious productivity", I can only feel sorry for your parents.


BE NICE PLEASE. PEACE. we all have our own views. respect them
 
Both of these folks come from a marketing background. There is no valuable insight given.

At least put Anand up for an interview so we can hear something meaningful.
It’s the day before the big reveal. They conveyed as much information as they wanted to. It wasn’t a matter of not having the right people to interview. Check back for more details next week.
 
So the Apple Silicon Macs will be based on the A series?

The DTK uses an A-series processor.

Apple could theoretically name their future processors anything they want, but I think it’s been confirmed that they will at least use the same ISA as the A series (which I think is just the ARM standard) no matter what they are called.
 
this could be the last GREAT ARM chip
I mean, how small can they make a processor?
Seems like 5mm is the limit
next chip 2.5mm? I think not
they would have to start piggybacking two 5mm chips together at this point

I would never buy a professional Mac with an ARM chip

I just think Intel and AMD are much better at this game of chip design for heavy intensive computing.

It's "nm" for nanometer, and it is not the end-all, be-all that determines whether a chip is awesome or not. In general 5nm is a fantastic milestone that's worth crowing about, if you're Apple or TSMC - but it's not the one single factor that makes the A-series silicon great.

And by heavy intensive computing I'm not sure if you mean "playing Overwatch" or something more serious - but be aware that Intel and AMD don't control the hardware or the operating systems their machines use. Apple and their SoCs are just barely getting started, and they have set themselves up very well in terms of vertical integration and control.

Bonus unsolicited education for you - check Srouji's resume. He worked for one of the "better chip design" companies and an outfit famous for "heavy intensive computing" before coming to Apple. ;)
 
“We spend a lot of time working with the product teams and software teams, and the architecture group really does sit in the center of that,” said Millet. [...]

Seems like we've found the bottleneck. Why ISN'T the architecture team at the center of the group?! They alone would push the software team to really innovate and flex what they'll have! Moreover, they'd help the marketing team finally "get it", and marketing would have to actually work for a living and advertise real benefits - not ignore hardware features that we've seen fall away so many times: MagSafe, 3D Touch (sorry, Haptic Touch fans, 3D Touch was still much better).
 
So the Apple Silicon Macs will be based on the A series?

"Based on the design" would be more accurate I think. Apple was clear that Macs will have their own chips, but obviously, elements of the design, such as the CPU / GPU cores will be common across different device types. The Mac SoC will (hopefully) have more cores than the iPad, remove features specific to phones/tablets (such as motion sensors), and add features that are relevant to running a full OS.
 
Almost none of that is x86-specific. You can just run an ARM container instead.


ARM was originally designed to be an energy-efficient chip, NOT a server-grade, multi-core, high-GHz chip.

You will see, Apple will get left behind. Everyone will go x86 and Windows 10.

Besides, it is so easy to build a Hackintosh that is much cheaper and more powerful than Apple's $10K Mac Pro.
 
You would be surprised how many developers use Macs with "non-traditional" VM tools such as Docker, Valet, Homestead and other container setups. Software like MAMP and WAMP is a thing of the past; there are so many advantages to containerization-based development.

This is going to be a huge struggle for developers if they are unable to efficiently run Linux micro-OSes without passing everything through Rosetta 2. That will increase hardware costs and decrease developer productivity.

For example, with the latest version of Magento 2.4+ you cannot even host the software on a Mac or Windows desktop... it REQUIRES a VM or dedicated box instead.

Very interesting; I was not aware so many developers used VM tools. I hope, for the developers' sake, that some sort of solution comes around. It would be unfortunate to require two computers to complete a single task, if that were even a viable option.
 
Well, I'm a pro gamer and an aspiring Twitch variety streamer, including playing Overwatch and DJing EDM. Apple chips are whack compared to Kirin and AMD. You ain't going to outspec my gaming rig.

Apple's XDR crap monitor can't even output 144 Hz, which makes speedrunning games impossible. So much for a "pro" monitor.

I'll bet you were singing the praises of Intel chips in your gaming rig 12 years ago, and back in 1998 singing the praises of AMD's 1 GHz Athlon chip too, if you're old enough (I may have the year wrong, but for 2-3 years the Athlon bested the Pentium 4 before Intel came out on top).

We'll see what the future holds for gaming on the various desktop computing platforms - if desktop computing even remains a thing beyond the foreseeable future.

Regarding Kirin: what is your baseline performance evaluation for comparison vs. A-series chips?
- Transistor count in billions?
- How efficiently the entire phone runs on a full battery, comparing battery capacity?
- Which games can run? That isn't a good litmus test or a scientifically sound comparison, since not all games run on both platforms.
- Uptime after a cold start? Again, not a good comparison, since the OS and apps can drastically affect this beyond the end user's control.
- Benchmarks? Find one that runs equally well, as designed, on each modern mobile platform.
 
I am waiting for a Mac mini Pro with Apple Silicon. The mini should be one of the first to get Apple's new chip.
 
ARM was originally designed to be an energy-efficient chip, NOT a server-grade, multi-core, high-GHz chip.

You will see, Apple will get left behind. Everyone will go x86 and Windows 10.

Besides, it is so easy to build a Hackintosh that is much cheaper and more powerful than Apple's $10K Mac Pro.

Building a "hackintosh" is illegal. Apple doesn't care about individuals doing it, as long as those individuals don't claim that it is legal. One small company (Psystar) tried to turn it into a business; they were eventually ordered by the court to pay $3,000 per computer shipped for DMCA violations. (Not that Apple ever saw any of that money, because Psystar went bankrupt.) Fact is, you can't run macOS on non-Apple hardware without a DMCA violation.

It doesn't matter what ARM was originally designed to be. x86 was originally designed to be an 8/16-bit processor using no more than 640 KB of memory and running at a cool 4.77 MHz. (Sorry, it was 1,088 kilobytes.) ARM was a powerful 32-bit processor from the start. As far as "server multi core" is concerned, there are ARM processors with 128 cores in actual use in servers; you can rent ARM server instances from Amazon AWS, for example.

BE NICE PLEASE. PEACE. we all have our own views. respect them
There are views that I'm not going to accept.
 
One time when you were working, did you get zapped by a laser beam and were digitized and then transported into the computer?
Attirex, that's unlikely. However, cmaier _did_ indeed design CPUs for AMD and for other companies before that.

(Sorry, originally wrote ARM by mistake).
 
Almost none of that is x86-specific. You can just run an ARM container instead.

Actually, that's incorrect. Docker on macOS instantiates a VM. You're not running your container on macOS; you're running it inside that VM. That's true for many other toolsets as well; behind the scenes they're spinning up a VirtualBox- or VMware-based VM.

It's unfortunate, but true. Heck, you can't run an ARM container on macOS/x86; it doesn't work that way. You'd have to get an ARM emulator and run Docker-on-ARM inside it (is there Docker-on-ARM?).
 
Well, I'm a pro gamer and an aspiring Twitch variety streamer, including playing Overwatch and DJing EDM. Apple chips are whack compared to Kirin and AMD. You ain't going to outspec my gaming rig.

Apple's XDR crap monitor can't even output 144 Hz, which makes speedrunning games impossible. So much for a "pro" monitor.

That's like saying "my Lamborghini beats the pants off your electric SUV on the racetrack!". They are different devices for different uses. Apple Silicon is all about efficiency and performance/Watt - making powerful computers that stay cool and mobile devices that have long battery lives. They are not competing in the high-end desktop gaming space...at least not yet!
 
I'm not sure how you did your calculation that 3.5 GHz takes three times the energy of 2.0 GHz, and that might be true of x86, but I don't think we have enough information about Apple's ARM architecture to make a calculation like that. Apple laptops will have at least 10 hours of use, and they will adjust power consumption to be able to make those claims.
That's general knowledge. Everything else being equal, power consumption grows roughly with the square of the clock speed, and the square of (3.5 / 2) is about 3.06. That's only true within a reasonable range: it reflects the fact that you need higher voltage to run at a higher clock speed, but past some speed your chip will start melting, and at the low end you can't keep reducing the voltage because at some point the chip stops working. For 2.0 vs. 3.5 GHz, though, the rule of thumb holds up fine.
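
(For reference, the usual first-order CMOS relation is that dynamic power scales with switched capacitance times voltage squared times frequency; the "square of the clock speed" rule of thumb amounts to folding a modest voltage bump in with the frequency bump. A sketch with illustrative numbers, not measured figures:)

```latex
% First-order CMOS dynamic power:
P_{\text{dyn}} \;\propto\; C\,V^2 f

% 2.0 GHz -> 3.5 GHz is a 1.75x frequency increase; the quoted ~3x
% follows if the supply voltage also rises by roughly 30% (illustrative):
\frac{P_{3.5}}{P_{2.0}} \;\approx\; 1.75 \times (1.32)^2 \;\approx\; 3.0
```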
 
Actually, that's incorrect. Docker on macOS instantiates a VM. You're not running your container on macOS; you're running it inside that VM. That's true for many other toolsets as well; behind the scenes they're spinning up a VirtualBox- or VMware-based VM.

It's unfortunate, but true. Heck, you can't run an ARM container on macOS/x86; it doesn't work that way. You'd have to get an ARM emulator and run Docker-on-ARM inside it (is there Docker-on-ARM?).

Compiling Docker for ARM shouldn't be too bad; at most a couple of weeks of work.

But Docker is just the shell... the bigger issue is the OS and its components, all of which might need work to recompile for ARM. Considering the number of tools that have to be migrated - database engines, caching engines, etc. - it really can become a mess. Chances are there will be dependencies that will NEVER be updated, and workarounds will need to be found to replace them.

For a developer who just wants to work on websites like they used to, it might be a while before it is plug-and-play again.
 