Apple's Arm-Based Macs With Apple Silicon Chips Will Support Thunderbolt

Everything about Big Sur screams touch screen support. Fonts are bigger, there are settings to add more space in the menu bar, sliders everywhere, tons of padding between controls.

I thought they might eventually converge iOS (now iPadOS) with MacOS, but the Apple bigwigs always denied it in interviews. Now that I've seen Big Sur, it makes me think they've had a change of heart. Apple is pushing the iPad to be a PC, but that won't really happen if the iPad always runs a light version of what's on MacOS. The second thing the iPad needs to do is support different aspect ratios and external monitors that extend (not mirror) the desktop.
 
Frankly, though, if you're doing audio (or other pro work) I wouldn't plan on queueing up to be the first to get an ARM Mac, any more than you should rush to install the latest MacOS on release day, given that it usually takes a few months for all the plug-in and driver makers to even catch up with a new OS release...

Many audio interfaces have gone class-compliant which in theory should work with ARM Macs (they'll work with iPads and iPhones if they have their own power). The exceptions would be the UA Apollos and other audio I/Fs that require sophisticated control software. Apogee is on the right track with their Symphony Desktop (and interestingly also the Quartet) where they are configured right from the front of the interface itself.

As for software, stick with Logic Pro and you'll be fine for a DAW. If you use 3rd party plugins you'll likely have to wait. I own Pro Tools, Cubase etc as well, but I've been trying to use Logic because it really is the best value when you consider the free, timely updates and NO SUBSCRIPTION.
ARM in itself is still old, you can not deny that.

Wouldn't that be like saying: "the internal combustion engine is old, so we shouldn't use it"?
 
I mean I would like physical function keys back. You can't touch-type with a Touch Bar.
I've had my Touch Bar set to show function keys for the last few days, and I have the same experience as when using an external keyboard. The main issue for me is that I don't use the function keys at all. I personally find the function key row too far away to touch type on, too. Chatting to friends, most agree that touch typing the numbers is hard enough, let alone reaching all the way up to the function row. Another point: if you adjust your sound or brightness with those keys, you have to look anyway.
 
Best to prevent the computer industry from reverting back to that state.

Why? No, seriously, why are more architectures bad? There are other ways of ensuring cross platform support than just native binary support.

"and more ARM devices ship these days than x86" = Humans buy too many mobile devices.


Old man yells at cloud (pun intended)?

Mobile devices are pretty capable these days, for a lot of folks they're their primary computing device, in large portions of the world they dominate as the only computing device.
 
ARM in itself is still old, you can not deny that.

Not as old as x86 - an instruction set which has roots going back to the 8008 in 1972 - 'easy translation' of 8008/8080 code was part of the brief for the 8086 in 1978, and since then binary compatibility has been a thing. The original ARM is at least a decade younger and was designed from the ground up (with inspiration from 6502 and Berkeley RISC but not compatibility - anyway, an ARM 2 could software-emulate a 6502 faster than the real thing...) - and from what I can glean quickly, binary compatibility with the original ISA (not the same as modern ARM32) is very limited and hasn't been a priority.

Will it interface to PCIe or some other bus? And how many total PCIe lanes are available, and how fast? I have been following ARM devices and they usually have limited or no PCIe.

Yes, ARM can do PCIe (e.g. https://www.gigabyte.com/uk/Server-Motherboard/MP30-AR1-rev-11#ov) - it's just not in great demand on phones, tablets or cheap "maker" boards (like the Raspberry Pi) built from surplus set-top-box chipsets. Quite likely that the A12 has limited/no PCIe which may be why Apple can't just hang a Titan Ridge chip off it to get Thunderbolt. I'd guess that the "real" Apple Mac Silicon will have USB4/Thunderbolt 4 on-chip.

Bear in mind, though, that (a) only the Mac Pro has PCIe slots (and any Mac Pro/iMac Pro replacement is likely to be the last step of the 2 year transition), (b) it sounds like the first wave of ARM Macs are going to rely on Apple Silicon iGPUs and (c) the SSD controller is baked onto the Apple Silicon, so while the SSD may hang off something PCIe-like, those lanes are going to be dedicated to Apple's proprietary SSDs. So it's not clear what PCIe would be 'for' in a MacBook or iMac - the real question there is whether the Thunderbolt/USB4 implementation will support eGPUs or PCIe enclosures.

Many audio interfaces have gone class-compliant which in theory should work with ARM Macs (they'll work with iPads and iPhones if they have their own power). The exceptions would be the UA Apollos and other audio I/Fs that require sophisticated control software.

"Class compliant" would already be on my "must have" list, given the annual opportunity for low-level drivers to be broken by the latest OS update. "Control software" doesn't have to be low-level - it could be implemented via a virtual MIDI interface, sending control voltages via audio channels, talking to an on-device webserver etc. Plenty of MIDI devices come with "control panel" software, but all that usually does is send CC and SysEx messages via the class-compliant MIDI interface - and should be perfectly fine under Rosetta. The real "big bang" here is the dropping of 32-bit support - which some people who have been holding off from Catalina are going to face at the same time as switching to ARM. Fortunately, I think Logic itself dropped 32-bit plug-in support some time ago.

Have to say, looking at the Apollo website, it's hard to see what is "drivers/control software" and what is just bundled audio effect plug-ins. Still - they have at least one USB version so they can get going on that with the dev kit :) For actual audio plug-ins it is somewhat more likely that they'll have hand-optimised x86 code in them - but frankly the way forward there should be to replace all that with system calls and let the OS do the optimising (especially since Apple-proprietary on-chip acceleration is likely to be a 'thing' with future ARM chips). You can do that (and work through all the other checklists for ARM compatibility) on an x86. The mantra should really be "welcome to 2020 - make it CPU independent" not "make it ARM dependent".
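
As a trivial illustration of "make it CPU independent" (my own example, not anything from UA's code): instead of hand-rolled SSE/AVX intrinsics for something like a gain stage, call into the OS's DSP library and let Apple worry about what CPU it's running on:

```swift
import Accelerate

// Hypothetical gain stage: multiply every sample by a constant.
// vDSP ships with the OS and is tuned by Apple for whichever CPU it
// runs on, so the same source code builds for x86 and ARM unchanged.
func applyGain(_ samples: [Float], gain: Float) -> [Float] {
    var g = gain
    var out = [Float](repeating: 0, count: samples.count)
    vDSP_vsmul(samples, 1, &g, &out, 1, vDSP_Length(samples.count))
    return out
}
```

Obviously a real plug-in is a lot more than a gain knob, but the principle scales: every chunk of architecture-specific assembly you replace with an Accelerate (or similar) call is one less thing to port.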

...for anybody with low-level drivers that really do give one-ier '1s', rounder '0s' and (more realistically) better latency, then there's probably not a lot of point testing it on anything but Apple's final USB4/TB implementation.

As for software, stick with Logic Pro and you'll be fine for a DAW. If you use 3rd party plugins you'll likely have to wait.

True, but I should imagine that it is pretty common for Logic users to have spent several times more on third-party instruments and effects than they did on Logic itself...
 
That doesn't work on any modern RISC processor. Once your CPU runs at 2.5 GHz, RISC or not, or as a desktop processor at 3.5 GHz, reading data from L1 cache is a 3 or 4 cycle operation already. On the other hand, the A12 can decode 7 instructions per cycle, and execute up to 10. There are plenty of instructions with long latencies.

I was talking about the "idea of RISC" not about how something is working today. So you would have to look back more than 30 years in order to understand what i was talking about.
Also even today, most functional units accepting 1 instruction per cycle if the are independent, even it the latency is longer. The key here is, that the units are pipelined. Indeed when looking back - one micro-architectural property of the first RISC processors was, that they had a fully pipelined architecture - contrary to the contemporary CISC designs.
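
For anyone following along, the throughput-vs-latency point works out roughly like this (numbers purely illustrative):

```latex
% Fully pipelined unit: one independent op issued per cycle, latency L cycles
T_{\text{pipelined}} = N + (L - 1) \quad \text{cycles for } N \text{ independent ops}
% Un-pipelined unit: each op must complete before the next can start
T_{\text{serial}} = N \cdot L
% e.g. N = 100, L = 4: 103 cycles vs 400 cycles
```

So a 3-4 cycle L1 load latency barely matters as long as the compiler and the out-of-order hardware can keep feeding the units independent work.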

In any case, I was talking about what makes a RISC processor simply from the viewpoint of the ISA design principles, and how back in the day this was an important paradigm shift from a HW design point of view.

So x86 certainly is NOT a RISC CPU even if it uses some RISC-like principles at the microcode level - that was my whole point.
 
I love it. Use it all the time. Admittedly I have an awesome setup for apps with BetterTouchTool.

It's interesting that some people hate on this... just because it doesn't fit their needs or they can't be bothered to find out what it can do.

That said, I would love for Apple to have an OLED keyboard like the Optimus Maximus. Completely customisable for every app - showing tool images in Photoshop, for example.

It's not a neutral take it or leave it proposition though. They took something away to add it. I prefer the function keys.
 
Get rid of that nonsense and just go to usb 4. Streamline it even more.

Yes, that is what they are doing.


It probably is exactly that. USB4 supports Thunderbolt 3

As far as I know, it's the other way round: USB4 is Thunderbolt 3 with support for USB.
Yes, but Intel is the licence holder, and could, in theory at least, have withheld the licence from Apple.

I have read that USB-C *can* include Thunderbolt-3 without a licence, but that TB3 support is one of the optional items in the USB4 standard.

Intel gave the licence away to USB for free as far as I know, and with USB4 they are trying to have very few optional things; they are making a lot of the tech mandatory for manufacturers, which will make consumers' lives easier. At least that's the plan - we have to see how it pans out.
 
Have to say, looking at the Apollo website, it's hard to see what is "drivers/control software" and what is just bundled audio effect plug-ins. Still - they have at least one USB version so they can get going on that with the dev kit :)

My experience thus far with UA updates has not been great. It took them a while to get up to Catalina, and I've seen it complain about incremental Catalina updates too. Their software is fairly sophisticated as far as audio interfaces go when you consider the virtual mixer, user account integration, etc.

My guess is that they're not going to be ready for ARM Macs for a while, which is another reason why I stick with the Apogee/ Logic combo (plus I'm genuinely satisfied with them). This might be why the new Apogee Symphony Desktop interface can be controlled completely from the hardware, including the plugins as well. This is actually a smart move on their part because it relies less on software to function.

I feel sorry for Pro Tools users who I suspect will be waiting a LONG while for ARM support given Avid's history. If I was to guess, I'd bet it's going to take them a year or two to release support for ARM.
 


Wouldn't that be like saying: "the internal combustion engine is old, so we shouldn't use it"?
When people use computers, they aren't always looking for the most low powered device.
Why? No, seriously, why are more architectures bad? There are other ways of ensuring cross platform support than just native binary support.




Old man yells at cloud (pun intended)?

Mobile devices are pretty capable these days, for a lot of folks they're their primary computing device, in large portions of the world they dominate as the only computing device.
One, do you want a return of the early computer industry with little to no standards? Two, what I mean is there are people who have grown so reliant on mobile devices that they will fight over wall sockets just to charge them. I even had a person unplug my MacBook Pro just to charge their phone (it was an old one bought second-hand without a battery, so that made it really annoying).
 
One, do you want a return of the early computer industry with little to no standards?
So you’re suggesting that the only other alternative is to essentially mandate that there will only ever be x86_64 architecture desktop/laptop computers from here on out?

Name me 5 other computer companies who are realistically in a position to develop, sell, and support a new incompatible architecture, so that there’s an actual chance of this incompatible-architecture Armageddon you’re trying so hard to protect us from. I can’t get higher than 2, and they don’t have the mindshare to get people to switch, so they won’t go down that path.

What it looks like to me is there most definitely won’t be a scene like “the early computer industry with little to no standards.” Instead, there’ll be the PC world, and Apple. And in 5 years, everything Apple sells will use the same family of processors. They’re going simpler and more consolidated, not fractured and more chaotic.
 
So you’re suggesting that the only other alternative is to essentially mandate that there will only ever be x86_64 architecture desktop/laptop computers from here on out?

Name me 5 other computer companies who are realistically in a position to develop, sell, and support a new incompatible architecture, so that there’s an actual chance of this incompatible-architecture Armageddon you’re trying so hard to protect us from. I can’t get higher than 2, and they don’t have the mindshare to get people to switch, so they won’t go down that path.

What it looks like to me is there most definitely won’t be a scene like “the early computer industry with little to no standards.” Instead, there’ll be the PC world, and Apple. And in 5 years, everything Apple sells will use the same family of processors. They’re going simpler and more consolidated, not fractured and more chaotic.
Eh, at some point someone stupid enough to do it might come along. The only reasons Apple is doing it are to have uber-tight control over everything and to avoid putting actually good heat sinks into MacBooks (though I question why Apple decided to put Core series processors in MacBook Airs rather than something more on the average consumer level, like Pentiums).
 
When people use computers, they aren't always looking for the most low powered device.

One, do you want a return of the early computer industry with little to no standards? Two, what I mean is there are people who have grown so reliant on mobile devices that they will fight over wall sockets just to charge them. I even had a person unplug my MacBook Pro just to charge their phone (it was an old one bought second-hand without a battery, so that made it really annoying).

1) "More architectures" != "No standards"
2) Of course people are reliant on their phones, it's their camera, messaging utility, web browser, actual phone, wallet, GPS/Map, etc all in one. The world today is digital, and for most people their phone is an essential piece of communication. And someone unplugging your device isn't a problem with phones, it's someone being a jerk, something that, I assure you, existed before smartphones.
 
1) "More architectures" != "No standards"
2) Of course people are reliant on their phones, it's their camera, messaging utility, web browser, actual phone, wallet, GPS/Map, etc all in one. The world today is digital, and for most people their phone is an essential piece of communication. And someone unplugging your device isn't a problem with phones, it's someone being a jerk, something that, I assure you, existed before smartphones.
I mean reliant in that they act as if they can't live without the phone.
 
I mean reliant in that they act as if they can't live without the phone.

When it's your primary means of everything from communication to payment to job searching to... etc of course you feel like that. Welcome to the modern world, we have T-Shirts (some with mobile phone brands on them). You can also stick an onion in your belt if you want to hark back to earlier times, and they still sell canes you can shake at young people on your lawn as they walk past talking on their newfangled smartphones.
 
When it's your primary means of everything from communication to payment to job searching to... etc of course you feel like that. Welcome to the modern world, we have T-Shirts (some with mobile phone brands on them). You can also stick an onion in your belt if you want to hark back to earlier times, and they still sell canes you can shake at young people on your lawn as they walk past talking on their newfangled smartphones.
I'm referring less to adults in that and more so to children and teenagers.
 
One, do you want a return of the early computer industry with little to no standards?

It's 2020 and we can - and do - have standards that don't depend on CPU architecture. In fact, we could have had those in the 1980s when microprocessors were just starting to get powerful enough to run grown-up operating systems like Unix that offered source-level compatibility in high-level languages and some measure of hardware abstraction - but then along came IBM, froze the PC world on 1970s tech and got the market hooked on binary compatibility.

The IBM PC wasn't a "standard" - it was just a. n. other CP/M-like system with a proprietary BIOS that grew to dominance because it had those three magic letters on the front (with an army of smart-suited salesdroids and an unhealthy dominance of the business equipment market behind it). Despite all the revisionist history about it being an open standard (sure, IBM would let others make expansions and software for it, which was "open" by IBM standards but already s.o.p. in the personal computing world) nobody else could make an IBM compatible machine without IBM's copyrighted firmware until some bright spark found a legal way to reverse-engineer it.

Funny how so few of the worthwhile standards - like the ones allowing us to network our computers, connect to the internet, send emails, browse the web, most of the high-level languages and crossplatform APIs - actually originated on the PC... but then when you're a proprietary monolith you don't need standards, you just hard-wire everything for the only machine that counts.

The idea that everything has to be locked to the CPU architecture really is a notion from the PC/Wintel world - Unix/Linux has always been cross-platform, Android runs on both ARM and x86 with most apps shipping as CPU-independent bytecode (...it's just that nobody wants an x86 mobile) and Apple are gearing up for their 4th CPU change...
 
Apple's developer documentation indicates that dGPUs are unlikely and that Apple are pretty excited about where they are going with graphics processing. If I were a betting man, I'd expect some pretty impressive integrated GPUs on Apple Silicon.
Whether integrated into the SoC or a separate chip isn’t especially relevant. But yeah, expect Apple to use their own GPU, from the Mini and laptops to iMac level GPU requirements. For Mac Pro and iMac Pro, it wouldn’t surprise me if they continued with AMD for some time, post AS-ification of those models. More due to the overall low sales potential than anything else.
 
Personally it’s about having to pay the premium for a TouchBar to get the specs I need on the machine while having an inferior workflow.

I’m glad they added back the physical escape key but my model doesn’t have that. As a software engineer the Touch Bar provides zero acceleration to my workflow and is just a detriment.

I've customized it quite a bit, but more so to reduce its negative impact. It doesn't add anything but I've managed to make it less bothersome.
How much of a premium do you think you’re paying for the Touch Bar?

I’d be surprised if it were more than $5-10.
On that last part, possible to merge the Mac with Apple’s mobile products.
They’re not doing that. Bottom line, they’re replacing a lower-performing CPU and GPU with higher-performing parts. Macs will still run MacOS; mobile devices will still run iOS and iPadOS. imo 95% of Mac users won’t even know (or care).

Of course, some current features of Macs like Boot Camp and virtualizing x64 OSes and other x64 code won’t be supported at first, if ever.
 
Get rid of that nonsense and just go to usb 4. Streamline it even more.

USB 4 will be dated by the time the new ARM Macs ship out late this year and in 2021. I'm already testing USB-X, which is basically fiber between 2 devices and can do 1Pbps. It also supports up to 16k resolution across 32 monitors and has many adapters, which Apple will love.
 
I would hope so since everything is moving to USB-C .

Everything except SD cards, external storage, audio devices and even display ports (although this is more understandable). I wonder how much use a USB Type-A port would get. Perhaps a Windows laptop manufacturer has released some data.
 