So - if we're waiting until WWDC for a new 14"/16" MBP - I'm assuming Apple will use TSMC's 5nm+ (N5P) process, just as they supposedly are for the A15, and call it M2.

That means the "M1X" benchmarks are bogus - because there should be at least some single-core improvement (7%?) with the move from 5nm to 5nm+ - and CPUMonkey doesn't show that.
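For a rough sense of what that ~7% would mean, a quick sketch against a purely hypothetical baseline score (illustrative only, not a real benchmark result):

```python
# Illustrative only: what a ~7% N5 -> N5P uplift would look like against a
# hypothetical baseline single-core score (not a real benchmark number).
baseline_score = 1700      # hypothetical M1-class single-core score
n5p_uplift = 0.07          # the ~7% process improvement suggested above
print(round(baseline_score * (1 + n5p_uplift)))   # -> 1819
```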
 
It's going to be called M1, have more cores, and I'm not sure which node it will use. But depending on whether they decide to redo the physical design, there may or may not be a clock-speed improvement. It also depends on what the critical path is.
 
The ridiculous irony of this post. Every god damn time it comes up: "Pro users want HDMI. Pro users still need SD cards." Oh, actually what I meant was: Bubba Joe wants to plug in his cheapo no-brand HDMI TV as a bigger screen, and $25 is a bridge too ****ing far, so let's gimp this $3K top-of-the-line laptop to suit him.

While I'm a huge fan overall of the move to USB-C and think adding USB-A back would be a step backwards, HDMI is a different story. A lot of professionals, including myself, have to present at conferences, meetings, etc. often, and having an HDMI port without needing to mess with adapters would be nice. The SD card slot would be useful to me personally, for Raspberry Pis and such, but it's less of a pro issue in general, sure.
 
It's going to be called M1, have more cores, and I'm not sure which node it will use. But depending on whether they decide to redo the physical design, there may or may not be a clock-speed improvement. It also depends on what the critical path is.
Wait, wait, wait, hold on - we didn't put any fan schematics in there... stolen-mission chapter 2, codename 'again' (someone at Apple). Just joking. Don't wanna be the ones when Apple finds 'em.
 
For now, yes.

Just like there were people carrying around USB floppy drives in 1998 and external DVD drives in 2008. That wasn't a good reason to bring back those ancient technologies then, and it isn't now.

The industry needs to move on, and bringing back legacy ports will just make the transition longer and more painful.

HDMI will disappear from conference rooms sooner or later, and end up where it belongs, alongside VGA and DVI.

This is asinine and out of touch with reality.

HDMI is the most common port used on display technology around the world. It is the de facto standard for wired display connections on every single display and television available (and has been for well over a decade).

TVs and display boards are not going to remove HDMI. They aren't going to swap HDMI for DisplayPort or even USB-C as the main driver. Believing such is absolutely out of touch with reality.

This means there must be some means to provide physical connectivity. I'm honestly OK with the laptop having a USB-C to HDMI adapter. But to believe that Apple's magical push (as a minority player in the PC market) is somehow going to change the trillion-dollar television and display board industry and make it drop HDMI? You're just evidencing you don't know what you're talking about.

Wireless is also not a suitable replacement. E.g., meeting rooms: my network is highly secured and must meet regulatory compliance requirements for who can access it. I DO NOT and WILL NOT ever give access to my corporate network to visitors. This means if they want to present, they are plugging in. In addition, wireless, unless perfect, is not going to deliver the maximum bandwidth for 4K and HDR media in uncompressed formats (like high-quality Blu-rays).
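To put rough numbers on that - a back-of-the-envelope sketch, assuming a 4K60 stream with 10-bit colour (nominal figures, not measurements):

```python
# Back-of-the-envelope: raw bandwidth of an uncompressed 4K HDR video
# stream - the kind of signal an HDMI link carries natively.
width, height = 3840, 2160   # 4K UHD
bits_per_pixel = 30          # 10-bit HDR, 3 channels
fps = 60

bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 4K60 HDR: {bps / 1e9:.1f} Gbit/s")   # ~14.9 Gbit/s

# Rough comparisons (nominal / typical real-world figures):
#   HDMI 2.0 link:       18 Gbit/s
#   Wi-Fi 5 (802.11ac):  well under 1 Gbit/s actual throughput
#   Wi-Fi 6 (802.11ax):  ~1-2 Gbit/s actual throughput, in ideal conditions
```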

HDMI is going nowhere (for now) and you need to accept that.
 
Exactly. We can all agree that floppy disks and CDs/DVDs are in the past now. USB-A is also becoming obsolete. But HDMI is not going anywhere anytime soon.
 
Floppies and USB-A were replaced by something that provided a near-identical experience with more convenience and higher utility.

Floppies/DVDs were replaced by USB drives, which provided all the benefits and none of the drive issues: smaller, more capacity, and faster.

USB-C could offer this, but it's not universally accepted as a display technology outside of the computer industry. This means that as long as TV manufacturers say HDMI is the standard, HDMI stays the standard. And they've been able to continue developing HDMI to support all the latest display technologies and quality without needing to change the physical port. This is only going to keep it around longer, as there's no technical reason why HDMI is inferior.
 
And Apple should be worried about changing the Lightning port to USB-C.
 
I literally don't care about a small increase in boot time, since it's such a rare occurrence for me and I'm not staring at the screen while it boots. I did not notice any operational slowdowns moving from Catalina to Big Sur, and no additional beachballs or instabilities. The upgrade has been a non-issue for me, even on my old computer.
You might be one of the rare few, then. Almost universally, Big Sur creates a slower experience than previous OSes - which is standard - but the speed degradation from Catalina to Big Sur is much larger than the previous decreases in performance.
 
Wireless is also not a suitable replacement. E.g., meeting rooms: my network is highly secured and must meet regulatory compliance requirements for who can access it. I DO NOT and WILL NOT ever give access to my corporate network to visitors. This means if they want to present, they are plugging in. In addition, wireless, unless perfect, is not going to deliver the maximum bandwidth for 4K and HDR media in uncompressed formats (like high-quality Blu-rays).
It is suitable. No meeting room in any corporate environment is tied to its infrastructure. It runs on its own wireless, completely detached and not connected. No one is sending uncompressed HDR media like Blu-rays in a meeting room.
4K is easily done on wireless, but again, no one expects that. At most 1920×1080 is used, for readability reasons - sometimes even lower. This includes slides, video, PowerPoint, Excel, etc.
We went wireless 5 years ago and it's been a godsend, because every vendor or employee has some laptop that doesn't have the correct connection. Keeping a bunch of adapters around was not a good solution: we either had to tie them down or keep replacing missing ones.
 
You might be one of the rare few, then. Almost universally, Big Sur creates a slower experience than previous OSes - which is standard - but the speed degradation from Catalina to Big Sur is much larger than the previous decreases in performance.
We all know that internet complaints come from a minority, not the majority.
 

But all you're showing is the disconnect between use cases, and why replacing HDMI isn't happening anytime soon.

TV makers aren't building displays for exclusive use cases; they're making general-purpose devices that have to support a wide range of connectivity, and the standard on billions of devices is already HDMI. HDMI is capable of the full range of supported technologies, from the most feature-rich 8K HDR content to simple presentations with almost no dynamic content.

Replacing it with USB-C would be a lateral move with no real benefit to TV technologies. I think every TV having USB-C input in addition to HDMI would be a good step. But the idea that HDMI is a legacy port that needs to go away is in complete opposition to the billions, if not trillions, of devices actively using it, the ubiquity of its availability, and its technological support.
 
This means that as long as TV manufacturers say HDMI is the standard, HDMI stays the standard.
Sure, for TVs.

they've been able to continue developing HDMI to support all the latest display technologies and quality without needing to change the physical port.
Which physical port would that be? Type A, B, C, D or E?

Also, what cable type? Can you tell me what resolution, colour depth, and refresh rate each of these supports?

  • Standard HDMI Cable
  • Standard HDMI Cable with Ethernet
  • Standard Automotive HDMI Cable
  • High Speed HDMI Cable
  • High Speed HDMI Cable with Ethernet
  • Premium High Speed HDMI Cable
  • Premium High Speed HDMI Cable with Ethernet
  • Ultra High Speed HDMI Cable (48G Cable)
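For reference, a rough cheat-sheet of the nominal link rates per cable category (approximate figures from the HDMI specs; the "with Ethernet" variants carry the same video bandwidth):

```python
# Nominal maximum link rates per HDMI cable category (approximate,
# from the HDMI specs); "with Ethernet" variants add the HEC channel
# but carry the same video bandwidth.
HDMI_CABLES = {
    "Standard":            (4.95, "1080i / 720p"),
    "Standard Automotive": (4.95, "1080i / 720p, stricter EMI requirements"),
    "High Speed":          (10.2, "4K @ 30 Hz, deep colour"),
    "Premium High Speed":  (18.0, "4K @ 60 Hz with HDR (HDMI 2.0)"),
    "Ultra High Speed":    (48.0, "8K @ 60 Hz / 4K @ 120 Hz (HDMI 2.1)"),
}

for name, (gbps, capability) in HDMI_CABLES.items():
    print(f"{name:<20} {gbps:>5} Gbit/s  {capability}")
```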
Also, HDMI, as a general rule, never supports "the latest display technology" until well after DisplayPort.

Why do you think high-end displays all use DisplayPort, and even when they have HDMI, it only runs at 4K? This has been a constant story for years.

This is only going to keep it around longer, as there's no technical reason why HDMI is inferior.
No one said you can't use HDMI devices, but to pretend that it's technically superior is laughable.


My network is highly secured and must meet regulatory compliance requirements for who can access it. I DO NOT and WILL NOT ever give access to my corporate network to visitors. This means if they want to present, they are plugging in.
You know that you can have more than one Wi-Fi network in the same location, right?

In addition, wireless, unless perfect, is not going to deliver the maximum bandwidth for 4K and HDR media in uncompressed formats (like high-quality Blu-rays)
Who is playing "high-quality Blu-rays" in a conference room?

AirPlay or similar is the ideal solution for a meeting room. If all you're doing is showing a presentation, a god damn iPhone could do that. I don't see anyone clamouring for HDMI ports on a ****ing iPhone.
 
Replacing it with USB-C would be a lateral move with no real benefit to TV technologies.

Who ever mentioned removing HDMI from TVs?

Someone said conference rooms will move on - and he's right. Until that time, an adapter is the right solution.

Hardwiring a single-use port, and stealing a video output that could otherwise drive a real monitor, is a **** solution to a made-up problem, to appease entitled whiners.
 
For me, the only thing I'm more or less aligned on is the removal of the Touch Bar. But I use my MBP 13 TB 2016 mostly in clamshell mode on an external 4K desktop screen, and when using it alone I find myself using the Touch Bar in a basic fashion (sound and brightness mostly - who knows, maybe I would use it more if I used the MBP directly more! :) )

As to the ports and SD card slot, I'm not sure...

MagSafe? Why not? Tripping over the cable can for sure happen. Will a USB-C port still be usable for charging? I guess so, but is this known for sure?

SD card slot: a strange move, I would say, as there are basically several things in motion:
  • Digital cameras are, more and more, just the phone
  • Digital cameras (the ones remaining) are trying to be more and more "connected" ("air" or easy cable data transfer)
  • Plus, new card formats are arriving - will that slot support them?
Overall, I would say I don't understand this move at all.

HDMI: I can see the use cases, of course, but somehow I would have preferred an Ethernet port (or maybe even a USB 3 port - a lot of stuff still uses that).

3 USB-C ports instead of 4: I can manage. How many will be Thunderbolt? (And label them clearly!!)

Overall I don't care that much - just bring that 14" MBP quickly! :)

But I find these moves quite a bit "un-Apple" overall...
 
Yet now you're happy to suggest that those who actually have a use for those USB-C ports should use a USB4 hub to regain said ports. The cheapest USB4 hub I've seen is $220, and every one I've seen requires AC power.
No reason you can't have a bus-powered USB4 hub. They've only been a possibility since last November, so it's not surprising that there's a poor choice of USB4 hubs just at the moment. You know, a bit like anything with USB-C or TB3 when the 2016 MBP came out... Of course, the powered ones are designed to power a 13" MBP/Air as well, eliminating the power brick and giving you one-port docking on the desktop.

Unless, of course, hardly anybody actually wants more TB/USB-C ports - just their USB-A, HDMI and SD ports back - which is what the preponderance of "legacy" hubs/docks on the market (vs. the lack of even regular USB-C/3.1 hubs) is screaming out to anybody prepared to listen. But I get it - people needing external hubs, docks and adapters is perfectly fine as long as it happens to other people.

But if you were to say, plug a DisplayPort display into one port and a PCIe device into the other, on one side: there's no contention, there's no competition for bandwidth.

In a 2-TB4 port laptop you lose that, because the moment you go externally to a Hub, you're forcing it to multiplex the two.
Yes, you've just neatly encapsulated why combining data, display and power - which don't need to share any resources - into a single port is a bloody stupid idea unless you're making a phone that only has space for one port (the one place Apple doesn't use USB-C/TB)... but we seem to be stuck with it anyway. In reality, multiplexing isn't an issue unless you're really pulling 40Gb/s out of every port, and if you don't use Thunderbolt you're stuck with only one DP stream per port, whereas DP-over-Thunderbolt can support multiple/MST displays.
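A back-of-the-envelope sketch of that multiplexing point, using nominal link rates (the display and SSD figures below are assumed examples, not measurements):

```python
# Sketch of the contention argument, using nominal link rates.
TB4_LINK = 40.0    # Gbit/s per Thunderbolt 4 link
DP_4K60 = 12.5     # Gbit/s, approx. for a 3840x2160 @ 60 Hz, 8-bit stream
NVME_SSD = 22.0    # Gbit/s, e.g. a fast NVMe enclosure saturating TB3 data

# One device per port (separate controllers): each gets a full link.
print(f"separate ports: {TB4_LINK - DP_4K60:.1f} Gbit/s free beside the display")

# Both devices behind one hub: they multiplex a single upstream link.
print(f"via one hub:    {TB4_LINK - DP_4K60 - NVME_SSD:.1f} Gbit/s headroom left")
```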

The M1 Mini has hardwired HDMI - but you continually dismiss the idea that a MBP would do this.
Hardwiring on the mini is only a problem because the M1 only supports a single display over Thunderbolt (and maybe that's all the GPU can manage, especially as that can now be a 6K display) - and, as people keep explaining to you, the hardwired HDMI is there on the mini because the M1 has a hardwired output to drive a laptop display. If the M1X/M2/whatever in the new 16" MBP doesn't match the old 16" by supporting dual 6K (Thunderbolt) displays or four 4K displays, then it's going to get shredded. Worst case, it only supports three 4K displays (oh, the humanity!) via Thunderbolt and #4 has to be HDMI - then, frankly, that's a non-issue; most 4K displays have HDMI in anyhow.

But then you also keep insisting that because the current M1 Macs have 1 TB controller per port, all future M-whatever Macs will also have 1 controller per port.
Seriously? You think it is remotely plausible that Apple are going to put 2 TB controllers in the entry-level M1 chip and then only put one in the "pro" M1X/M2 chip? Anything is possible - but some things are just ridiculously unlikely.

I mean, Apple could really cheap out and just put a slightly overclocked M1 into the new 16" MBP - I'd happily join you in the chorus of derision - but even then they'd have 2 TB controllers.

Given this leak shows 3 USB-C-type ports, the only real question is whether the third port (a) is just a USB port, as per the M1 iMacs, (b) shares a TB controller with port 2, or (c) has a third TB controller. I'm gonna guess (a), but (b) or (c) would solve your display problem...
 