Again, instead of dodging the question, how about answering it? What did removing the headphone jack allow for? One key example, please.

I’ll say it again:

“The continued presence of the headphone port would take away from other stuff that made it into the iPhone 7 (and newer) instead.”

It doesn’t matter specifically what it made room for. It made room - space. There’s an awful lot of technology, circuitry, battery, and other stuff in these iPhones. If you put the headphone jack back in you have to take something else out. So you tell me. You obviously know what’s in there far better than my ignorant self. What should they take out to put the headphone jack back in? Or did they just take it out to save money like you said, and leave the space empty?
 
Lightning does not offer superior sound. In fact, it does not send any "sound" like a traditional headphone. It sends a pure digital signal, the same as the USB A to USB C cable I have going from my Mac to my Fiio K3. Removing the headphone jack also removed the DAC, and if you think the DAC in the lightning earbuds is equal to the DAC driving your Mac, you have really bad hearing. If you wanna come at an audiophile with a defense of Lightning, try first learning what Lightning actually does.

Lightning's benefits have NOTHING to do with the removal of the DAC in the iPhone. They went from including a high-quality DAC built into the phone to... a crappy one in a dongle. Bluetooth audio, while convenient, is noticeably lower quality than traditional high-end wired cans.

Umm... why are lightning and Macs in the same sentence and comparison? Lightning has nothing to do with the Mac except for charging their Bluetooth input devices.

Second, it’s a phone for goodness sake, not a high-end audio system. 99.9% (yes, I made that statistic up, but the point is it’s the vast majority) of the market for these devices gets their music downloaded or streamed from online sources with compression algorithms that marginally degrade the sound. True lossless at its best brings a CD-quality audio file down to about half its size. The amount of information lost encoding to the 256 kbps AAC or MP3 format typically used for music on the vast majority of iPhones results in significantly more noticeable loss in quality than the difference between the Lightning-based DAC and whatever DAC was inside the phone before that.
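
A quick back-of-the-envelope on those bitrates (the figures are standard; the comparison itself is mine):

```python
# Back-of-the-envelope audio bitrate comparison.
# CD audio: 2 channels x 44,100 samples/s x 16 bits/sample.
cd_kbps = 2 * 44_100 * 16 / 1000        # ~1411 kbps
lossless_kbps = cd_kbps / 2             # lossless roughly halves CD size
aac_kbps = 256                          # typical AAC download/stream bitrate

print(f"CD:       {cd_kbps:.0f} kbps")
print(f"Lossless: {lossless_kbps:.0f} kbps (about half of CD)")
print(f"AAC:      {aac_kbps} kbps ({aac_kbps / cd_kbps:.0%} of the CD bitrate)")
```

So a 256 kbps file keeps well under a fifth of the raw CD bitrate, which is the point: the encoding step discards far more information than any DAC swap.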

Now if you’re the sort of person only listening to lossless audio on your phone then frankly, you’re not Apple’s target market, and they really couldn’t give a rip about you. And why should they? Like I said, it’s a handheld device, not a high-end hi-fi system. If you want a high-end professional-sounding audio system in your hand then the iPhone (and its predecessor, the iPod) was never the right product for you - headphone jack or not.

All that said... you started this with your ridiculous claim that Apple is removing stuff just to save money. Maybe them removing the headphone jack was a good idea or maybe it wasn’t, but as I said in my first reply to you: by all means have an opinion about the merits of these decisions and how they affect you if you want, but the idea that they’re swapping old ports for newer tech in these devices just to try to save money... it’s utter nonsense and anything else you say after that has pretty limited credibility.
 
So I think it's more about product line than capability, where the one functional distinguishing characteristic seems to be ISV certification.

Sounds like it has zero technical merit, then. They’re basically saying Apple could paint an MBP blue, call it the MacBook Enterprise Office Advanced, and suddenly have a “workstation laptop”.

Note also that we're focusing here on a minor, parenthetical point from the first paragraph of my last post, rather than its key big-picture message: that, if you want to know what ports pros who purchase high-end laptops want, look for commonalities in the high-end offerings of vendors who serve that market.

OK, fair point.

That's a good point about mobile vs stationary use when it comes to HDMI. Indeed, I've made the point myself that HDMI is valuable if you have to give lectures in which you connect your laptop to a typical projector. And it's not just about having the right connector. It's that you're less likely to encounter problems if you can go HDMI->HDMI rather than TB->HDMI. A projector where I lecture works well with the former, but gives me intermittent snow with the latter.

That sounds like a bad adapter. It’s just a signaling mode switch; you’re literally sending the same thing with a different plug on one side. The projector wouldn’t even know.
 
Umm... why are lightning and Macs in the same sentence and comparison? Lightning has nothing to do with the Mac.

So to recap: there is no benefit to Lightning audio sourcing, removing the jack and DAC made room for... something something, and the phone is thinner. They did it to make the phone thinner and to increase the profit margins per device. That's all. Just like removing MagSafe (and to goose Bluetooth Apple earbud sales and Lightning royalties for 3rd-party Lightning headphones).

These straw-man arguments that people who like useful ports are niche users are really rich. The only reason I brought up the original headphone jack removal on the iPhone in this MBP thread was to point to another product where Apple had seemingly sacrificed useful functionality simply for $, not for actually improved UX.

These are not limited design hangups among users with no "credibility" (whatever that means; I've been an Apple user since 1982). The Magic Mouse 2 bottom charging port and the Apple Pencil charging port were ridiculously bad decisions that Apple quickly fixed. Apple switched from the longtime proprietary ADB standard to the more popular USB standard. So it's reasonable for dedicated users on this forum to hope for Apple to make good/smart changes when issues arise.

PS You don't need to be an audiophile to know that the wired audio experience of listening even to YouTube is not as good via dongle vs. the good old Apple DAC-powered headphone jack. And that's using $20 Philips headphones from CVS, not some crazy expensive open-back pair.

By the way, every single streaming service, including Apple Music, offers a high-end-quality streaming option. In fact, it is the last profit-driver tool left to most of the services (Amazon is rumored to be launching a separate high-end audio service this year). It will be an important selling point going forward. Incredible, I know.
 
(Can someone remind me what Lightning has to do with a 16-inch MacBook Pro?)
Nothing! It was a tiny referenced comment in the larger thread about how the disappointing port options on the MBP seemed to be in fiscal alignment with other product design choices from Apple as of late. The Jony Ive stories from the WSJ seem to buttress that argument, although it is complicated to know how much Ive was a positive/negative counter to the current design ethos (the truth is probably in between). Lightning only became more prominent in this thread when there were inaccurate claims about Lightning being made as a way to tout one's post as more credible than others.

The real issue is why Apple says TB3/USB-C is the future standard for the flagship laptop machines (to say nothing of why it's no longer important to protect your machine from someone tripping on the power cord), but it's perfectly fine to continue with USB-A as the standard for THE flagship device for the entire company.

As for SD cards, these are the tools for photography and video professionals. If SD cards were no longer being improved, I would agree to drop the slot. But the cards are getting cheaper, faster, and growing in storage capacity. It seems like a fairly small concession to make to a very viable external storage format that is easier and more suitable in many situations than plugging in a full-sized external drive (especially when cloud backup is not available).

As other sites have pointed out, Apple's future revenue growth is in services, and they're not afraid to use the devices to force people to those services (which is not evil or wrong to do in a general sense). But it seems to be getting a little intrusive with the hard sell.
 
Sounds like it has zero technical merit, then. They’re basically saying Apple could paint an MBP blue, call it the MacBook Enterprise Office Advanced, and suddenly have a “workstation laptop”.

It's possible you're getting things backwards here. Recall your concern was that IDC's definition of mobile workstation was highly restrictive (say, Xeons only), such that being the best-selling mobile workstation merely meant being the biggest fish in a small pond. Hence the idea that it's easy to get called a mobile workstation militates against your concern (meaning the definition isn't that restrictive), rather than supporting it.

That sounds like a bad adapter. It’s just a signaling mode switch; you’re literally sending the same thing with a different plug on one side. The projector wouldn’t even know.

Thanks for the info, and I'm not saying you're wrong (no idea myself, and it's certainly worth trying another adapter), but I've read elsewhere it's not that straightforward - the DP output has to properly switch to HDMI compatibility mode, which adds an extra layer of complexity. And there may be other stuff going on as well:

"DisplayPort natively outputs in a LVDS signal type that is not compatible with HDMI (HDMI uses TMDS). It does have a dual-mode version that will support TMDS in compatibility mode."
[ https://www.tempest-av.com/single-p...-and-DisplayPort-to-HDMI-Conversion-Explained ]

"The DisplayPort standard specifies an “HDMI compatibility mode”, where the signal format reverts back to HDMI type interface. However, not all DisplayPort devices are required to support this feature."
[ https://www.magenta-research.com/files/AN-11-006_DisplayPort_to_HDMI_Conversion_App_Note-1.0.pdf ]

See also:
https://apple.stackexchange.com/questions/281640/mini-displayport-to-hdmi-cable-vs-hdmi-to-hdmi
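
A toy sketch of the distinction those quotes draw, between passive (dual-mode/DP++) adapters and active converters (the class and function names are mine, purely illustrative - real adapters negotiate this in hardware):

```python
# Illustrative model of DisplayPort-to-HDMI adapter behavior (names are mine).
from dataclasses import dataclass

@dataclass
class Source:
    supports_dual_mode: bool  # "DP++": the port can fall back to TMDS signaling

@dataclass
class Adapter:
    active: bool  # an active adapter converts DP output to TMDS itself

def can_drive_hdmi(source: Source, adapter: Adapter) -> bool:
    """A passive adapter only rewires pins, so the source must emit TMDS;
    an active adapter performs the conversion regardless of the source."""
    return adapter.active or source.supports_dual_mode

# A dual-mode port works with a cheap passive adapter...
print(can_drive_hdmi(Source(supports_dual_mode=True), Adapter(active=False)))   # True
# ...but a DP-only source needs an active converter.
print(can_drive_hdmi(Source(supports_dual_mode=False), Adapter(active=False)))  # False
```

Which is why "it's just a different plug" only holds when both the port and the adapter support the compatibility mode.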



I'll need to look into this further.
 
The problem with SD is that it's no longer the only standard among pro cameras. The newer Nikons use XQD, either exclusively or as a preferred format. Canon uses CFast in some cameras, as do Hasselblad and a number of video-oriented companies. Sony still cameras and many others are still using SD almost exclusively. Both Canon and Nikon may move to CFExpress (easy for Nikon - it's just a firmware upgrade from XQD to CFExpress). The CFast companies, including Canon, would have to change hardware, but probably will end up on CFExpress as well.

Pro video is a mess of formats, everything (including SD, of course - a few cameras even use arrays of SD cards) from custom-housed SSDs (SATA and NVMe) to Sony SxS and Panasonic P2 cards. The idea is that CFExpress is capacious and fast enough to replace almost all of it, and that we'll see SD on the consumer end, CFExpress in most pro-oriented gear (photo and video), and housed NVMe SSDs for some very high end video applications. We'll have a mess for a few years more, though. Even when it resolves, both SD and CFExpress will be common among MacBook Pro users. Probably the best solution is going to end up being external readers - USB 3.1 is fast enough for any single card, while Thunderbolt 3 is there for high-end multi-card readers.
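
For context on that reader suggestion, rough peak figures for the interfaces involved (spec maxima from memory - treat as approximate; real cards, especially early CFexpress ones, run well below the bus ceiling):

```python
# Approximate peak interface throughput in MB/s (spec maxima; real-world lower).
interfaces = {
    "SD UHS-II card":     312,   # full-duplex UHS-II bus
    "CFast 2.0 card":     600,   # SATA III based
    "CFexpress card":    2000,   # PCIe 3.0 x2 / NVMe spec ceiling
    "USB 3.1 Gen 2 bus": 1250,   # 10 Gbps
    "Thunderbolt 3 bus": 5000,   # 40 Gbps
}

for name, mb_s in interfaces.items():
    print(f"{name:18s} ~{mb_s:4d} MB/s")
```

USB 3.1 Gen 2 comfortably covers SD and CFast, while Thunderbolt 3 has the headroom for multi-card readers and CFexpress at its spec ceiling.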

I'd love to see one USB-A port, mainly for convenience uses like memory sticks. HDMI makes sense, primarily for dongle-free projector connections. Ethernet won't fit any conceivable body (it's limited to the thickest versions of competing mobile workstations).
 
The keyboard is a significant concern, particularly given that it's the only keyboard across the entire MacBook line-up. In other words, if you want any laptop at all running macOS, you're stuck with this keyboard (barring external ones). Given that, it better be a pretty damn good keyboard, and reliability-wise, that just doesn't seem to be the case.

The ports, meh. Dongles are less than ideal, but not that big a deal.

USB-A I can see. Ethernet? No frigging way. Maybe (that's quite a stretch) as an early MacBook Air-style trap door. But to put an Ethernet port on this model would make it quite a niche variant, not the general-purpose high-end laptop. It'd simply be too thick.

Now maybe, maybe they'll bundle in a 10 GigE Thunderbolt adapter. The rMBPs had a 1 GigE Thunderbolt adapter bundled in (I believe? Maybe I bought it?).



Uhhhh HDMI is way more widespread among projectors, TVs and displays than DisplayPort will ever be.

It's also much thinner than Ethernet. If they do add back some ports, they'll largely be the ones the rMBP had, i.e. USB-A, HDMI, SD.

The new 16" MacBook Pro is going to get thicker! It has to in order to get the needed space for a more aggressive heatsink and cooling system.

Then the issue of thinness is moot! There will be plenty of space! BTW, systems which are thinner offer Ethernet without any hatch, and again, Ethernet is faster than WiFi.
 
The new 16" MacBook Pro is going to get thicker! It has to in order to get the needed space for a more aggressive heatsink and cooling system.

I wouldn't be surprised if it gets slightly thicker to accommodate a better keyboard.

I'm not sure on what you are basing the assumption that there will be "a more aggressive heatsink and cooling system". The current MBP already ships with i9 CPUs, and no rumors that I can see have made mention of this.

Then the issue of thinness is moot!

No it isn't.

It wouldn't just have to be thicker than the current 1.55cm. It would also have to be thicker than the previous-gen (2012!) 1.8cm. The last models to ship an Ethernet port were 2.4-2.5cm thick.

Maybe they can squeeze it into something thinner than that, but I think you're setting yourself up for failure if you're betting on them doing it.

There will be plenty of space! BTW, systems which are thinner offer Ethernet without any hatch, and again, Ethernet is faster than WiFi.

Sure, and a Xeon-W CPU is faster than a Core-H, but that doesn't mean the MacBook Pro will ship with one.
 
The benefits of everything soldered in are pretty substantial. Not only does it facilitate putting more power in a smaller space, there are far fewer points of failure when soldered vs. pushed into a slot.

Everyone complains about the lack of upgradeability, but I’ve been upgrading my Macs for years at very reasonable cost.

1. Macs hold their value much better than anything else in the market.
2. Migration Assistant makes transferring everything from one Mac to another seamless.

So to upgrade your Mac you sell your old one and buy a new one (or a less old one). For me, the difference in price rarely differs much, if at all, from the price of whatever extra piece of hardware (RAM stick, new SSD, whatever) I’d have bought to upgrade. And I get a new warranty every time I do that.

It’s a different mindset, but it works. This whole “can’t upgrade” thing is a non issue when you actually try it this way.

Having serviceable RAM or storage vs. soldered doesn't have any effect on power! While I'll agree with you that the failure rate of the connector might be higher, I can tell you I've seen too many soldered RAM and flash failures as well, so in my mind it's a bit of a wash.

Where are you buying your new systems? Apple charges quite a lot for RAM as well as storage in their laptops. Desktops are a lot different.
 
I wouldn't be surprised if it gets slightly thicker to accommodate a better keyboard.

I'm not sure on what you are basing the assumption that there will be "a more aggressive heatsink and cooling system". The current MBP already ships with i9 CPUs, and no rumors that I can see have made mention of this.



No it isn't.

It wouldn't just have to be thicker than the current 1.55cm. It would also have to be thicker than the previous-gen (2012!) 1.8cm. The last models to ship an Ethernet port were 2.4-2.5cm thick.

Maybe they can squeeze it into something thinner than that, but I think you're setting yourself up for failure if you're betting on them doing it.

Sure, and a Xeon-W CPU is faster than a Core-H, but that doesn't mean the MacBook Pro will ship with one.

The new 16" MacBook Pro is intended to serve the higher-end Pro market the current MacBook Pros just can't serve.

If you really study things you will see even the newest 2019 models are thermally constrained! At least they make the base clock; the 2018 i9 model didn't, even after the firmware fix.

The upper Pro market wants to leverage more of the performance of the i9 or whatever CPU Apple goes with. Intel is not able to get out the 10nm CPUs Apple needs for at least another two years. So if you can't lower the thermals by die shrinkage, then you need to provide the cooling the current crop of CPUs requires. That means a bigger case for the bigger heatsink and fan system.

One important point: oftentimes, even with a CPU die shrink, the gain is real estate, which is then used to add new stuff - which raises the thermal load again! While the first versions will likely be OK, the 2nd and 3rd gen will steal back the gain, so we are back in the same boat!

Yes, there is a strong market for the size and weight of the current MacBook Pro. There is also a very large market Apple left behind! These are the folks like me who need the higher performance the system could offer if properly sized, giving us back what made the MacBook Pro a pro's system >> onboard ports.

I think you think it's an either/or issue here; I don't!

It's two different lines. Just like the MacBook Air meets one market, there is a market that wants what we had, yet wants the better CPUs, screens, and security. I'm hoping that's what Apple sees.
 
The new 16" MacBook Pro is intended to serve the higher-end Pro market the current MacBook Pros just can't serve.

We don't actually know this. (Heck, we don't know this product exists at all.)

And if we did, we still wouldn't know how far that goes. It could go towards Xeon options like the 2286M.

Intel is not able to get out the 10nm CPUs Apple needs for at least another two years.

Yes, looks that way.

So if you can't lower the thermals by die shrinkage, then you need to provide the cooling the current crop of CPUs requires. That means a bigger case for the bigger heatsink and fan system.

This would eke out a few more percent of CPU at the cost of a completely different chassis. It doesn't seem to me like that's a niche Apple is interested in. That's different than the Mac Pro and iMac Pro; those are also a niche, but they yield way higher performance than their non-Pro relatives.

Yes, there is a strong market for the size and weight of the current MacBook Pro. There is also a very large market Apple left behind!

Yes, there are market segments Apple has left behind. Like a below-$6k tower. They know. They've clearly decided they're not interested. (They might change that decision, but I don't see it.)
 
This would eke out a few more percent of CPU at the cost of a completely different chassis. It doesn't seem to me like that's a niche Apple is interested in. That's different than the Mac Pro and iMac Pro; those are also a niche, but they yield way higher performance than their non-Pro relatives.

I'd say it's not just about getting more out of existing processors, but being able to put in more powerful ones, in particular more powerful mobile GPU's.

Having said that, do we have any data on what the actual percent increase in CPU and GPU performance would be if we took the top CPU and GPU in the current MBP and moved them to a much more thermally dissipative chassis? Are they merely "a few more percent"? E.g., have you found any benchmarks comparing the top-of-the-line MBP with the same GPU & CPU in, say, a generously cooled gaming laptop or mobile workstation? [GPU might be tough - most higher-end laptops seem to use NVIDIA.] I'm genuinely curious what those numbers would be.

Of course, if they *really* went with a completely different chassis, they could get a substantial performance increase by going with desktop processors (as Dell does with its Alienware Area 51 laptops). Granted, that seems highly unlikely, but it is fun to mention.
 
Apple would never use a desktop processor - that would mean accepting a sub-2 hour battery life! The gaming notebooks that use desktop CPUs (and usually have huge GPUs) use 99 Wh batteries and, even so, get one to two hours on a charge. They also tend to use 300 watt power adapters and sometimes even dual power adapters.
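
The battery math behind that estimate is easy to sanity-check (rough, ignoring conversion losses and idle time):

```python
# Rough runtime estimate: battery energy divided by average power draw.
battery_wh = 99  # the FAA carry-on ceiling used by big gaming laptops

def runtime_hours(avg_draw_watts: float) -> float:
    """Ideal runtime in hours, ignoring conversion losses."""
    return battery_wh / avg_draw_watts

# A desktop CPU plus a large GPU can easily average 50-100 W under load.
for draw_w in (50, 100):
    print(f"{draw_w:3d} W average draw -> {runtime_hours(draw_w):.1f} h")
```

At 50-100 W of sustained draw, even the largest legal battery lands squarely in that one-to-two-hour range.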
 
I'd say it's not just about getting more out of existing processors, but being able to put in more powerful ones, in particular more powerful mobile GPU's.

Yeah, that's true. There's not that much headroom on the CPU front, but there is for GPUs — if they make the device thicker and give it more battery (or are willing to give up a significant chunk of battery life).

Having said that, do we have any data on what the actual percent increase in CPU and GPU performance would be if we took the top CPU and GPU in the current MBP and moved them to a much more thermally dissipative chassis? Are they merely "a few more percent"? E.g., have you found any benchmarks comparing the top-of-the-line MBP with the same GPU & CPU in, say, a generously cooled gaming laptop or mobile workstation? [GPU might be tough - most higher-end laptops seem to use NVIDIA.] I'm genuinely curious what those numbers would be.

TL;DR: it depends on your workload.

The closest data I could find, headline-wise, would be Bare Feats's "Does the 2019 MacBook Pro 15-inch 2.3GHz 8-core exhibit thermal throttling?".

It's not that great an article, though. Does the DaVinci graph show all four runs? Where did each of them start? Why does the text say "the core frequency dropped as low as 2.39GHz, a tad above the base frequency of 2.3GHz" when it appeared to actually drop to roughly 2.0 GHz, below base? Why doesn't the LuxMark graph show temperature?

I also find their conclusion a little lacking:

It could be argued that as long as you remain above the base frequency (or 2.3GHz in this case), your 2019 MacBook Pro is not experiencing 'detrimental' thermal down throttling.

I'm actually a bit confused because the DaVinci graphs don't seem to support this — clearly, the CPU briefly drops to about 2.0 GHz before it recovers all the way to about 3.9 GHz? And this does seem to correlate with reaching almost 100 degrees C. There's a fair amount of difference between ~3.5 GHz and ~2.1 GHz.

OTOH, that's clearly Turbo Boost. This depends on a lot of variables, really. The 9880H used here actually goes up to 4.8 GHz, but we don't really know if it isn't doing that here because of thermal throttling, or because DaVinci uses too many cores for that to be possible (perhaps ~3.6 GHz is the max for however many cores DaVinci uses).

The same page shows results in Blender and LuxMark, and here we see a different picture: for Blender, the frequency never reaches 3 GHz but remains stable at a slight turbo of 2.8 GHz, with an also fairly stable temperature of 90 degrees C. With the LuxMark benchmark, we don't get temperature results (why not?), and it's unclear if temperature is the reason it eventually goes down to ~2.3 GHz.

So the short answer is: this model does appear to sustain its 2.3 GHz base clock across all eight cores… mostly (the exception being whatever the hell is going on in the middle of that DaVinci graph). That Blender graph looks pretty good.

Where the answer gets trickier is if fewer cores than eight are used and/or turbo gets enabled, both of which you'll frequently see in everyday use (very, very little code out there can make good use of four cores, let alone eight).
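
To make those frequency deltas concrete, a naive clock-proportional comparison (assumes throughput scales linearly with clock, which is only roughly true; the figures are the ones quoted above):

```python
# Clock speeds discussed above, expressed relative to the advertised base.
base_ghz = 2.3  # i9-9880H base clock

observed_ghz = {
    "DaVinci dip":       2.0,
    "Blender sustained": 2.8,
    "DaVinci recovery":  3.9,
    "single-core turbo": 4.8,
}

for label, ghz in observed_ghz.items():
    print(f"{label:17s} {ghz:.1f} GHz = {ghz / base_ghz:.2f}x base")
```

Under that (crude) linear assumption, the swing between the DaVinci dip and its recovery is nearly a 2x difference in throughput - which is why where the clock settles matters so much more than whether it technically stays "above base".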

Of course, if they *really* went with a completely different chassis, they could get a substantial performance increase by going with desktop processors (as Dell does with its Alienware Area 51 laptops). Granted, that seems highly unlikely, but it is fun to mention.

Yeah, no way.

(He says, wondering what impact Ive leaving will have.)
 
The problem with SD is that it's no longer the only standard among pro cameras. The newer Nikons use XQD, either exclusively or as a preferred format. Canon uses CFast in some cameras, as do Hasselblad and a number of video-oriented companies. Sony still cameras and many others are still using SD almost exclusively. Both Canon and Nikon may move to CFExpress (easy for Nikon - it's just a firmware upgrade from XQD to CFExpress). The CFast companies, including Canon, would have to change hardware, but probably will end up on CFExpress as well.

Pro video is a mess of formats, everything (including SD, of course - a few cameras even use arrays of SD cards) from custom-housed SSDs (SATA and NVMe) to Sony SxS and Panasonic P2 cards. The idea is that CFExpress is capacious and fast enough to replace almost all of it, and that we'll see SD on the consumer end, CFExpress in most pro-oriented gear (photo and video), and housed NVMe SSDs for some very high end video applications. We'll have a mess for a few years more, though. Even when it resolves, both SD and CFExpress will be common among MacBook Pro users. Probably the best solution is going to end up being external readers - USB 3.1 is fast enough for any single card, while Thunderbolt 3 is there for high-end multi-card readers.

I'd love to see one USB-A port, mainly for convenience uses like memory sticks. HDMI makes sense, primarily for dongle-free projector connections. Ethernet won't fit any conceivable body (it's limited to the thickest versions of competing mobile workstations).

Dude, posts like yours are why I love this site and forums. Thanks for taking the time to post this.

Personally, MagSafe is the only port decision (exclusion?) for me that had me worried about a larger philosophy shift with Apple leadership.
 
At present, there is literally no headroom to put a faster CPU in the 15" (16"), and there won't be for a while. Intel's so-called Mobile Xeons are just Core i7s and i9s with ECC RAM (same clock speeds and core counts). I'm not sure if they're binned differently, but I suspect not, because they don't seem to be a lot more expensive. Apple already uses the fastest Mobile i9, and the idea of a desktop processor is a non-starter in any reasonable notebook. They already make a 12 lb mobile desktop with a desktop CPU - it's called the 21.5" iMac. Many desktop-processor gaming notebooks are actually heavier than the little iMac, especially when you count their power supplies.

They do have headroom on the GPU, since AMD seems to have something close to release that might work (there are a number of unreleased Vega and Navi serial numbers around, and some of them seem to be mobile). Are they faster/higher-end, or are they evolutions of what Apple's using? A 7 nm evolution of the Vega 16/20, even if there is little else new, could run somewhat faster and help battery life.

The idea of a NVidia GPU is as unlikely as a desktop CPU, since neither Apple nor NVidia has a reasonable driver - Apple is very happy with their AMD driver, and they are right to be. It's pretty much a workstation-grade driver (although without independent certification). If NVidia ported anything, it would be their much less stable gaming driver, unless Apple paid for Quadros.

I'm almost sure that the very stable AMD driver is something Apple wrote and maintains themselves. For whatever reason, good or bad, Apple is either unwilling or unable to do the same for NVidia cards - it could be NVidia having a license agreement (if you use our cards, you have to use our driver). This would make sense to protect the expensive Quadro line, which are basically binned GeForces with highly stable drivers (mobile Quadros are nothing more than that; some desktop parts are slightly different in real specifications, although never close to justifying their price premiums).

If Macs were using GeForces as Quadros, it would give Apple a huge price advantage over HP and other workstation vendors who pay the Quadro premium (and perhaps encourage HP to write a "GeForce as Quadro" driver). NVidia could be saying "we'll sell you GeForces, but we'll only port the gaming driver, and you can't use your own" - if you want a more stable driver, you pay for Quadros. Apple is a big enough fish in AMDs pond that AMD is letting Apple use their own driver.

Of course, it could also be Apple simply being lazy! NVidia could have no objection to Apple using any driver they want (and there could be no technical obstacle to a stable Apple driver for GeForces), but Apple might be saying "we already have a nice Mac GPU driver, let's task those developers with designing more Memojis instead of writing an NVidia driver". I wouldn't put it past them.

I'm not sure what to make of the rumor of a scissor keyboard on the Air. It was the most recent portable refresh (Retina, USB-C) only a few months ago. It doesn't seem like the likely candidate, with the MacBook overdue for a refresh and the MBP coming up on its own.

I'm liking the idea of a new MBP with better thermals, a scissor keyboard and maybe a new GPU. The CPUs Apple's using are not only the best they can come up with, they're very, very good. If it has an HDMI port, USB-A (just one port - it's for memory sticks) or both, so much the better.
 
We don't actually know this. (Heck, we don't know this product exists at all.)

And if we did, we still wouldn't know how far that goes. It could go towards Xeon options like the 2286M.



Yes, looks that way.



This would eke out a few more percent of CPU at the cost of a completely different chassis. It doesn't seem to me like that's a niche Apple is interested in. That's different than the Mac Pro and iMac Pro; those are also a niche, but they yield way higher performance than their non-Pro relatives.



Yes, there are market segments Apple has left behind. Like a below-$6k tower. They know. They've clearly decided they're not interested. (They might change that decision, but I don't see it.)

Rumors have implied the newer system would be a more powerful system. Sure, a Xeon would be one possibility, but it would need more cooling!

As for the die shrink: it's a lot more than a few percent! Previous shrinks have gotten better than a 25% improvement in TDP.
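
Taking that 25% figure at face value (the percentage is the claim above, not mine; the 45 W starting point is a typical Intel H-series mobile TDP, assumed here for scale):

```python
# Illustrating the claimed ~25% TDP improvement from a die shrink.
current_tdp_w = 45   # typical Intel H-series mobile TDP (assumption for scale)
shrink_gain = 0.25   # claimed improvement from the post above

shrunk_tdp_w = current_tdp_w * (1 - shrink_gain)
print(f"{current_tdp_w} W -> {shrunk_tdp_w:.2f} W at the same performance")
# Equivalently, the same 45 W envelope could host a more capable chip.
```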

I fully realize you want a thinner and still thinner MacBook. That's fine! Others want a more usable portable MacBook Pro.

These are different markets, and the thinner end is fully fleshed out between the MacBook & MacBook Air systems as well as the current MacBook Pro. The other direction is what's needed.

I have a very strong feeling this is the direction Apple is going. Clearly the pendulum is swinging back to function over form. Form is also important! But not at the cost of a non-working or thermally limited design. That's what needs to be focused on now, and clearly Apple has arrived at the same conclusion with the coming scissor keyboard.

As far as thickness goes, we are not talking about a triple-thick system here - which I think is your fear!

This is not a gamer's laptop, and that's not what a working pro wants in the field either. This gets back to the balance of thin vs. thick. The move is to roll back a bit to the midpoint, getting the needed space for the new scissor keyboard, a larger battery, and - for the more advanced systems I'm looking for - better cooling so the system is not thermally constrained, plus bringing back the ports we as pros need.

The holes in the lineup are because of the size of the design team; they just can't make that many different systems at the same time. Apple thinks through each system design with great effort, which is why it just takes longer.

So Apple has gone very high-end with the Mac Pro. I know a few people who are just waiting! These are film and sound engineers and artists, which is who this system was designed for. Maybe Apple will intro a system in between the Mac mini and the new Mac Pro in a year or so. The older 2013 Mac Pro currently fills that spot.

At this point I think arguing over how many angels can dance on the head of a pin is pointless!
 
I'm not sure what to make of the rumor of a scissor keyboard on the Air. It was the most recent portable refresh (Retina, USB-C) only a few months ago. It doesn't seem like the likely candidate, with the MacBook overdue for a refresh and the MBP coming up on its own.

I'm not sure the 12-inch MacBook is long for this world now that the Y-series Air is a thing. For one, the marketing doesn't make much sense (it arguably never did, but especially not now that the Air is so similar).

They probably want to change something about it, which would explain why, by tomorrow, it won't have seen an update in exactly two years.

Maybe they'll rebrand it as 12-inch MacBook Air and otherwise leave it as is. Maybe they'll make it more like the 13-inch Air in the process: newer CPU/GPU, T2, 720p instead of 480p camera, Touch ID, Thunderbolt. Heck, maybe even two ports instead of one. Maybe drop its base config to 128 GB SSD like the 13-inch Air, so it can then start at $1,099. That would make so much more sense. And while you're doing all that, upgrade the 13-inch Air as well, bringing them both to Ice Lake-Y.

Or maybe they'll drop it altogether.

I'm liking the idea of a new MBP with better thermals, a scissor keyboard and maybe a new GPU.

I'd buy it sight unseen if only for the keyboard.
I fully realize you want a thinner and still thinner MacBook. That's fine! Others want a more usable portable MacBook Pro.

Actually, I do need a mobile workhorse. (I also like bringing it with me on the bike, though.)

It needs at least 32 Gigs of RAM, fast storage, and a reasonable CPU. I don't care much about the GPU.

I can get most of that today, but what people say about the keyboard really creeps me out.

Lest you get the impression I'm in the market for an Air, nope.
 
The closest data I could find, headline-wise, would be Bare Feats's "Does the 2019 MacBook Pro 15-inch 2.3GHz 8-core exhibit thermal throttling?".

It's not that great an article, though. Does the DaVinci graph show all four runs? Where did each of them start? Why does the text say "the core frequency dropped as low as 2.39GHz, a tad above the base frequency of 2.3GHz" when it appeared to actually drop to roughly 2.0 GHz, below base? Why doesn't the LuxMark graph show temperature?

I also find their conclusion a little lacking.

Yeah, it’s hard to deconvolute that. I think, if you want to get to the heart of how much processing speed you’re losing due to the MBP’s thermal constraints, it’s cleaner to instead do the kind of comparison I suggested: Find a system with the same CPU as the MBP, but which is as thermally unconstrained as possible, and compare performance.

Notebookcheck.net did a thermal analysis of the MSI GE75, which is a 6-pound, 0.74”-thick 17” laptop with ample cooling (sufficient to support an NVIDIA RTX 2080):

https://www.notebookcheck.net/MSI-GE75-Raider-9SG-Core-i9-9880H-RTX-2080-Laptop-Review.420559.0.html

They found that, with Cinebench R15 put into a loop, its Core i9-9880H was able to sustain 3.6 - 3.7 GHz in multithread mode! That shows the minimum the i9-9880H is capable of with good cooling. Alas, I haven’t been able to find an equivalent test for the 2019 MBP with that processor. The closest I came was a test by appleinsider.com of a 2019 MBP with the top-end i9-9980HK processor running Cinebench R20 (instead of R15) in multithread mode:

https://appleinsider.com/articles/1...t-core-2019-macbook-pro-with-vega-20-graphics

Not sure how much difference R15->R20 makes to max sustained multithread clock speed (maybe none, if the limitation is purely thermal), but they found (with this faster processor) that sustained multithread clock speeds were ~3.0 GHz.

So, even if the HK doesn’t have faster sustained multithread clocks than the H (and assuming the version of CB doesn’t matter in this regard), this one comparison suggests (very roughly) a 20% loss of CPU performance due to the MBP’s thermals for multithread loads. That’s actually not too bad, considering how much smaller the 15” MBP is than the 17” MSI.
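Spelling out that back-of-the-envelope estimate (the sustained clocks are taken from the two reviews linked above; using the midpoint of Notebookcheck's 3.6 - 3.7 GHz range is my own assumption):

```python
# Rough estimate of the multithread clock penalty from the MBP's thermals,
# using the sustained clocks reported in the two linked reviews.
msi_sustained_ghz = 3.65   # midpoint of the 3.6-3.7 GHz Notebookcheck measured
mbp_sustained_ghz = 3.0    # AppleInsider's sustained figure for the 2019 MBP

loss = (msi_sustained_ghz - mbp_sustained_ghz) / msi_sustained_ghz
print(f"~{loss:.0%} lower sustained clock")  # ~18%, i.e. roughly 20%
```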

Where the answer gets trickier is if fewer cores than eight are used and/or turbo gets enabled, both of which you'll frequently see in everyday use (very, very little code out there can make good use of four cores, let alone eight).

A notable exception to this is code written for embarrassingly parallel jobs: Say you have to repeat the same long calculation on each of 1000 different polymers. So, if you have 10 available cores, you'll create 10 programs. The first will run polymers no. 1 to 100, the second 101 to 200, etc. Then you will have all 10 cores (or however many you have available--I once had over 100 cores running simultaneously on my school's largest public cluster during a winter break) fully occupied.
 
it’s cleaner to instead do the kind of comparison I suggested: Find a system with the same CPU as the MBP, but which is as thermally unconstrained as possible, and compare performance.

Makes sense.

So, even if the HK doesn’t have faster sustained multithread clocks than the H (and assuming the version of CB doesn’t matter in this regard), this one comparison suggests (very roughly) a 20% loss of CPU performance due to the MBP’s thermals for multithread loads. That’s actually not too bad, considering how much smaller the 15” MBP is than the 17” MSI.

Interesting.

A notable exception to this is code written for embarrassingly parallel jobs: Say you have to repeat the same long calculation on each of 1000 different polymers. So, if you have 10 available cores, you'll create 10 programs. The first will run polymers no. 1 to 100, the second 101 to 200, etc. Then you will have all 10 cores (or however many you have available--I once had over 100 cores running simultaneously on my school's largest public cluster during a winter break) fully occupied.

But those are rare, and even then, there comes a time where you have to synchronize the results, at which point you may as well have just one core. (Good thing Turbo Boost is a thing now, unlike at your school cluster.)

But sure, heavily parallelized workloads are a thing.
 
Compared to DDR4 chips, no, but compared to the LPDDR that most MacBook models use, yes it does.
You might have mentioned it earlier, but in addition to using less power (specifically, in standby), LPDDR3 must be soldered; it's only available as a multi-chip package or package-on-package. There's no socketed LPDDR3 (or LPDDR4, either), such as SODIMMs.
 