For now, yes.

Just like there were people carrying around USB floppy drives in 1998 and external DVD drives in 2008. That wasn't a good reason to bring back those ancient technologies then, and this isn't one now.

The industry needs to move on, and bringing back legacy ports will just make the transition longer and more painful.

HDMI will disappear from conference rooms sooner or later, and end up where it belongs, alongside VGA and DVI.
Wrong analogy. It's not going anywhere while macOS keeps messing up external display compatibility. After the last Big Sur update, USB-C doesn't work on my 4K monitor, and after hours of frustrating troubleshooting I had to fall back to HDMI. I've heard the same from several colleagues. Also, corporate offices are notoriously slow to move on from legacy systems. So yeah, HDMI ain't going anywhere for now.
 
Has anyone looked closely at the schematics to find some specs? Do we have HDMI 2.1? Does the SD slot support UHS-II or UHS-III?
 
But all you're showing is the disconnect between use cases, and why replacing HDMI isn't happening anytime soon.

TV makers aren't building displays for exclusive purposes. They're making general-purpose displays that have to support a wide range of connectivity, and the standard already on billions of devices is HDMI. HDMI covers the full range of supported technologies, from the most feature-rich 8K HDR content to simple presentations with almost no dynamic content.

Replacing it with USB-C would be a lateral move with no real benefit to TV technology. I think every TV having a USB input in addition to HDMI would be a good step. But the idea that HDMI is a legacy port that needs to go away is in complete opposition to the billions, if not trillions, of devices actively using it, the ubiquity of its availability, and its technological support.
You've now moved the discussion from conference rooms, and your notion that wireless is tied into a corporate network and somehow doesn't work well with projectors, back onto TVs.

If you want evidence: more and more laptops do not include HDMI. There's no reason to.
 
No reason you can't have a bus-powered USB4 hub.
Except they don't exist, and even the powered ones are expensive. So your solution is to buy something that doesn't exist, as opposed to something that is ubiquitous and cheap.

Great talk.

But I get it - people needing external hub/docks adapters is perfectly fine as long as it happens to other people.
Not at all. A big part of the benefit of USB-C/TB3/USB4/TB4 ports is the ability to plug in adapters. Before an eGPU, I used a TB3 to dual-DP adapter for two 4K displays. Imagine that. Two 4K displays, from one tiny port.

For a while I had the laptop set up as a second machine on the desk, next to the mini, again with a DP display (using a USB-C to DP cable this time).

I'm not against using hubs or docks or adapters. I'm against adding single-use ports, which are solved by cheap, ubiquitous adapters or cables, at the expense of host-side ports, and being told to "just use that non-existent hub that's guaranteed to cost north of $200 if it ever exists, and good luck if it's ever available in your country".

USB-C to HDMI adapters are ****ing everywhere, thanks in part to the prevalence of USB-C on portable devices.

which don't need to share any resources - into a single port is a bloody stupid idea unless you're making a phone that only has space for one port

or unless you wanted to.. I dunno... support a plethora of downstream devices.

What's your suggestion? They use HDMI and USB Type-A ports? What about DisplayPort monitors? OK, what if they use Mini DisplayPort again? Well, you still need an adapter for HDMI. One of each? OK, great, so this "pro" laptop is limited to one real monitor and a ****ing TV. What if you want that external PCIe card? Well, **** you, USB Type-A doesn't support that.

Maybe, just maybe, the stupid thing here isn't the most flexible port the computer industry has ever offered, but the people who insist on using a 20-year-old port because their printer still has that type of plug.



because the M1 only supports a single display over Thunderbolt

Besides the circular logic of "well, the M1 mini does it", what's your logic for this? The behaviour was exactly the same on the Intel mini before the M1.

as people keep explaining to you the hardwired HDMI is there on the Mini because the M1 has a hardwired output to drive a laptop display.
Do you know what technology practically every laptop display is connected to the GPU via? eDP. Embedded DisplayPort.

Your theory is that the eDP output can be converted to HDMI just fine and dandy, but it can't be routed to a TB controller?

and #4 has to be HDMI then, frankly, that's a non-issue - most 4k displays have HDMI in, anyhow.
So, again: we're back to losing functionality for very slight convenience and it's "oh it doesn't matter". Doesn't matter to you maybe.

You think it is remotely plausible that Apple are going to put 2 TB controllers in the entry-level M1 chip and then only put one in the "pro" M1X/M2 chip?

Before this week, would you have thought it remotely plausible that Apple would use a magnetically attached power cord on a desktop computer, with Ethernet in the charging brick?

My point there was not actually that future models would go back to a 2:1 arrangement; it was that you consider some aspects of Apple device history "conveniently not relevant", while other aspects are a core part of your argument.

the only real question is whether the third port (a) is just a USB port, as per the M1 iMacs, (b) shares a TB controller with port 2, or (c) gets its own TB controller. I'm gonna guess (a), but (b) or (c) would solve your display problem...
(a) would raise some serious questions about wtf is going on in Cupertino, when the bottom-of-the-line (OK, bottom but one) all-in-one desktop has more I/O than their top-of-the-line pro laptop.

(b) raises similar questions, if they didn't learn from the mistake with the earlier 4-port 13" that had mismatched port performance.

(c) is the least bad scenario if the description in the OP is correct, but that doesn't mean it solves all the problems presented.
 
A lot of people here arguing about losing 1 USB-C port. That’s not what you’re losing. You’re losing a thunderbolt port. That’s a MUCH bigger deal for people that rely on TB (all of my external SSDs are thunderbolt)
As @theluggage pointed out, if the recent iMac announcement is anything to go by there's no guarantee those three ports are all TB3 capable either.
 
A lot of people here arguing about losing 1 USB-C port. That’s not what you’re losing. You’re losing a thunderbolt port. That’s a MUCH bigger deal for people that rely on TB (all of my external SSDs are thunderbolt)
I couldn't agree more.
People are ready to sacrifice a Thunderbolt port for an SD slot, an outdated technology no professional uses anymore (hobby photographers are not professionals).

HDMI makes a bit more sense, but Thunderbolt is so powerful.
This is simply to hide the fact that the M1 is not able to handle more than two Thunderbolt ports.
 
While I do think your posts could do without all the stars (censored words), I absolutely agree with you on the actual content. As I said earlier in this thread, it's beyond me how anyone would cheer for such a move.
 
Be thankful the forum software does censoring. I'm not one to mince words, and I'm not afraid to call a spade a ****ing spade.

Edit: I am Australian if that wasn't obvious already.
 
A lot of people here arguing about losing 1 USB-C port. That’s not what you’re losing. You’re losing a thunderbolt port. That’s a MUCH bigger deal for people that rely on TB (all of my external SSDs are thunderbolt)
Are we tho?

What the Intel Macs had is irrelevant, because so far no M1 Mac has more than two Thunderbolt ports, and these schematics say there will be three. If anything, we are gaining a Thunderbolt port.
 
For me, the only thing I'm more or less aligned on is the removal of the Touch Bar, but I use my MBP 13 TB 2016 mostly in clamshell mode on an external 4K desktop screen, and when using it on its own I find myself using the Touch Bar in a basic fashion (sound and brightness mostly; who knows, maybe I would use it more if I used the MBP directly more! :) )

As for the ports and SD card slot, I'm not sure...

MagSafe? Why not? Tripping over the cable can certainly happen. Will a USB-C port still be usable for charging? I guess so, but is this known for sure?

SD card slot: a strange move, I would say, as there are several things in motion:
  • Digital cameras increasingly being the phone
  • Digital cameras (the ones remaining) trying to be more and more "connected" ("air" or easy cable data transfer)
  • Plus the new card formats arriving; will that slot support them?
Overall, I don't understand this move at all.

HDMI: I can see the use cases, of course, but somehow I would have preferred an Ethernet port (or maybe even a USB 3 port; there's still a lot of gear that needs one).

3 USB-C instead of 4: I can manage, but how many will be Thunderbolt? (And label them clearly!)

Overall I don't care that much, just bring that MBP 14" quickly ! :)

But I find these moves quite "un-Apple" overall...

Knowing that the CPU/SoC transition is also going on, of course!
 
Are we tho?

What the Intel Macs had is irrelevant because so far no M1 Mac has more than two Thunderbolt ports but these schematics say there will be three. If anything we are gaining one Thunderbolt port.
Yes. Those machines didn't have more than 2 TB ports to begin with. They had USB-C ports, not TB. The models that had 4 are still there in their Intel form. I really don't see Apple selling both M(whatever) and Intel 16" notebooks. By your logic you're not gaining one either, because an M1 16" doesn't exist.

4 TB ports is a selling point of the 16” MBP to many professionals, even if they don’t need a notebook. I have a few photographer and videographer friends that exclusively use TB external drives. It’s the Apple entry point to a machine with that capability (due to the GPU advantage over the lower machines with 4). Take away that capability, and the cheapest computer Apple currently offers that can drive my 2 monitors is $6,000 (which, by the way, has 256GB of storage).

As I’ve said a few times, I know I’m a rare case. But to keep my current setup and workflow, I need what the current 16” has or better. No other Mac can power my 2 monitors (other than the Mac Pro). All of my external devices are Thunderbolt. I’ve already pre-ordered the new OWC TB hub in anticipation of losing a port. I can use a hub at my desk for the drives and whatnot, but not for the monitor.

If these don’t have the GPU juice to drive 2 of those, I won’t be transitioning any time soon (and will just need to find something else to spend the DTK credit on). My alternative is to sell one of my screens and get an iMac (the 24” or whatever the bigger one ends up being). I’m very much considering it, since my MBP is in clamshell 98% of the time anyway. I don’t care about the colors, the bezel, the chin. What’s going to drive me nuts is the screen size and resolution mismatch. I don’t know if I can overcome that, haha.

We also have an M1 MBA and love it for what it is. It can’t do everything I need all the time. So I’m hoping an Apple Silicon machine comes out that I can.
 

Wow, ok.

TVs and display boards are not going to remove HDMI. They aren't going to swap HDMI for DisplayPort, or even USB-C, as the main driver.

So?

Remember the Mini DisplayPort (and, a few years later, HDMI) adapters permanently taped or screwed to the VGA cables in every single conference room for years?

That solution worked fine, and if every laptop sold starting now had only USB-C ports, it would take a year for this to become standard, and another two or three years for the cables to be replaced.

Speaking of cables, and connecting computers to a TV: you can get a USB-C to HDMI cable for less than $10, pretty much the same cost as a decent HDMI cable. I have one connected to my TV.

It is a non-issue, and if all computers used USB-C exclusively, the situations where a dongle would be impractical or annoying would disappear quickly. In this hypothetical future there will still be HDMI-only situations, but in those situations simply using a dongle or the correct cable would not be nearly as impractical or annoying as having that fat, ugly, and useless port on every computer.
 
So either the next batch of information is late, or Apple/a third party paid to get the data back.
 
Wires hanging off a laptop are a terrible experience. A thick HDMI cable to a monitor, USB cables to peripherals, plus power make it impossible to use as a laptop; it turns into a performance-throttled mini (though at least one with a battery). All that should be connected to a laptop is one USB-C cable for power and video, connected to a separate base CPU unit that also connects to the other monitors and peripherals. Like a dongle, but with the computer in it, for better cooling, and with a battery so it can be unplugged and taken anywhere. And the laptop could be unplugged from the base and used as a light, battery-powered laptop.
 
Your theory is, that eDP output can be converted to HDMI just fine and dandy, but it can't be routed to a TB controller?
When the TB controller is inside the SoC and the eDP has been deliberately wired to the outside: no. Well, not without adding extra circuitry to the SoC, which, when 4 out of the 5 systems using it need an always-connected built-in display and the 5th is the cheapest Mac that you want to be limited, would be a waste of sand.

NB: I’m not even saying that the HDMI port in the new MBP definitely won’t be hard-wired, but it can’t be implemented the same way as in the M1 mini, using the “unused” internal LCD connection, because, duh!, laptop. Nor can it be implemented exactly the same way as in the Intel mini because, duh!, radically different GPU/CPU architecture. It’s anybody’s guess; you’re the one insisting that it will absolutely, definitely be hard-wired. As I said, as long as the new GPU supports a reasonable number of displays, who cares? Well, you, but I suspect that if it supported 49 displays via Thunderbolt and 1 via HDMI you’d still be cut up about the terrible HDMI port “stealing” a display.

Before this week would you have thought it remotely plausible that Apple would use a magnetically attached power cord on a desktop computer, with ethernet in the charging brick?

Of course it was plausible. Nobody predicted it, any more than they predicted white bezels, but if they had, everything about it would have made sense in terms of making a thinner computer, and Apple’s obsession with that is firmly established. (And now that we have seen it, Ethernet in the 16” MBP power brick seems like a real possibility.)

On the other hand, giving the M1X/M2 fewer TB controllers than the M1 would make no sense whatsoever. If you think that might realistically happen, then you’re the one making the extraordinary claim, and you need to come up with some evidence for it.
 
So sad ;-{ Apple can't design a combo USB-C port with a set of magnets so it's dual-purpose!

While I want MagSafe back, it shouldn't take the place of one of the Thunderbolt ports. And besides, with the rear ports dual-use, you could then power from the left or right side.

Apple this isn't that hard! Really! I've already modified a case to see if it could be done.
 
I suspect that if it supported 49 displays via Thunderbolt and 1 via HDMI you’d still be cut up about the terrible HDMI port “stealing” a display.
Let's get past supporting one external display before worrying about hyperbolic numbers, shall we?

And yes, for any likely number of displays supported: if one is "forced" to be via HDMI that is a failure, in my opinion.

Given that the alternative to that hard-wired HDMI port is to offer support via TB3/USB-C/USB4 ports, which can still drive that HDMI display using nothing more than a different cable if you wish, and that the cables and adapters to do so are ubiquitous and cheap, hard-wiring an HDMI port is a failure at any price point.


If you think that might realistically happen
Again, you're missing the point I was making. I wasn't disagreeing that Apple seems to be using 1 controller per port now. I was highlighting that you're happy to assume that behaviour will carry over, but you also assume other behaviour present in the same machines will not carry over, because it suits your narrative.


I see the 1-controller-per-port aspect and think "we might actually get a real-world improvement in external I/O speeds", because four ports on four controllers is better than four ports on two controllers, and much better than two ports on two controllers.


You apparently see the 1-controller-per-port aspect and use it as a way to justify the removal of 2 ports, so you can <checks notes> use a different HDMI cable. Meanwhile, you're apparently fine with external I/O staying at the same level it was 5 years ago, but available via fewer ports.


As you love to say: the hoops people will jump through to justify it....
 
This is the way.
 