I don't totally disagree that they need 10nm sooner rather than later. However, it's not Intel's fault that the latest version of the MacBook Pro is as thin as an ultrabook. At its thickest, it's roughly the same thickness as the MacBook. I mean, why? The marginal 1/6th of an inch shaved off between the 2015 and 2016 designs had no real benefit, only costs.
In fairness, it's also not Apple's fault that when they were designing the TB models, Intel was insisting their chip roadmap was heading towards cooler-running CPUs, while in actual fact they've ended up delivering significantly hotter-running ones that don't stick to their TDP as nicely as the chips did up to 2017. I'm pretty sure the TB models are designed the way they are because Apple were expecting at least iterations 3 and 4 to be using cooler 10nm chips, and initially probably thought only the 2016 models would be using 14nm chips (which actually do work OK inside these machines). Now you could make the argument that the 2018 minor redesign should have gone further, and I am inclined to agree (particularly taking the complete failure of the keyboard into consideration), but rolling out a completely new design takes quite a bit of planning ahead, and Intel were clinging on to Cannon Lake for a long time, probably assuring Apple that 10nm was just around the corner.
 
  • Like
Reactions: afir93 and Aquamite
@Falhófnir

A lot of your writing is speculation. Maybe true, maybe not. We can't really know.
But what we do know is that Apple has once again put themselves into a thermal corner, so to speak.

And Intel didn't force Apple to put i9 chips into the MBP, for example. The i9 performs even worse than the i7 in the 2018 MBP, and you have to pay extra to get that i9. Even Razer avoids the i9, and they have bulkier laptops with a far better cooling solution than Apple's.
 
If it could be thinner and lighter, I would be like hell yeah.

IMHO, the 15 is too big and heavy. I find the 13 much better. Now if they could get rid of the bezels and turn it into a 14.5" while keeping the weight the same...
Yes, I love the 15, it's just too heavy and large for me to take around daily...
 
It seems most of you guys think Apple will stick to Vega graphics for the 15”. Makes sense... but at the same time, AMD is rumored to reveal their new Navi chips any day now (the latest rumor is at E3 in June). If the rumors about Navi are true, we could get a dGPU with much lower power consumption and real-time ray tracing capabilities. That would definitely make the MacBook Pro far more usable for graphics-heavy applications. Not to mention, having real-time ray tracing would give us some future-proofing.

As good as Vega is, I really hope Apple can get those Navi chips in.
 
It seems most of you guys think Apple will stick to Vega graphics for the 15”. Makes sense... but at the same time, AMD is rumored to reveal their new Navi chips any day now (the latest rumor is at E3 in June). If the rumors about Navi are true, we could get a dGPU with much lower power consumption and real-time ray tracing capabilities. That would definitely make the MacBook Pro far more usable for graphics-heavy applications. Not to mention, having real-time ray tracing would give us some future-proofing.

As good as Vega is, I really hope Apple can get those Navi chips in.
I had originally heard Navi for mobile was more of a Q3/Q4 thing. Hope you are right tho.
@Falhófnir

A lot of your writing is speculation. Maybe true, maybe not. We can't really know.
But what we do know is that Apple has once again put themselves into a thermal corner, so to speak.

And Intel didn't force Apple to put i9 chips into the MBP, for example. The i9 performs even worse than the i7 in the 2018 MBP, and you have to pay extra to get that i9. Even Razer avoids the i9, and they have bulkier laptops with a far better cooling solution than Apple's.
Seriously, I hope you are not trying to apologize for the complete innovation disaster at Intel. 10nm was supposed to arrive YEARS ago. That's a fact; Falhófnir was not speculating one bit on that.
 
  • Like
Reactions: Falhófnir
It seems most of you guys think Apple will stick to Vega graphics for the 15”. Makes sense... but at the same time, AMD is rumored to reveal their new Navi chips any day now (the latest rumor is at E3 in June). If the rumors about Navi are true, we could get a dGPU with much lower power consumption and real-time ray tracing capabilities. That would definitely make the MacBook Pro far more usable for graphics-heavy applications. Not to mention, having real-time ray tracing would give us some future-proofing.

As good as Vega is, I really hope Apple can get those Navi chips in.
I think Navi will be launched for desktop first, so you'll have to wait longer for a mobile version suitable for the MBP.

Vega graphics is a much more realistic bet for the next MBP. Maybe a mobile version of Vega VII, whenever that launches, if it isn't out already.
 
Seriously, I hope you are not trying to apologize for the complete innovation disaster at Intel. 10nm was supposed to arrive YEARS ago. That's a fact; Falhófnir was not speculating one bit on that.

Never claimed that Intel didn't fail, because they have, and they failed really badly.
I was talking about the other speculation in his post; I thought that was obvious.
 
@Falhófnir

A lot of your writing is speculation. Maybe true, maybe not. We can't really know.
But what we do know is that Apple has once again put themselves into a thermal corner, so to speak.

And Intel didn't force Apple to put i9 chips into the MBP, for example. The i9 performs even worse than the i7 in the 2018 MBP, and you have to pay extra to get that i9. Even Razer avoids the i9, and they have bulkier laptops with a far better cooling solution than Apple's.
Really the only part I think you can fairly call speculation is whether Apple were still listening to Intel while making changes to the 2018 machine. The rest is based on what Intel themselves were publicly stating (and before you say that wasn't necessarily what they were telling Apple behind the scenes: it's extremely unlikely they were publicly saying 10nm/Cannon Lake was coming and then telling people at Apple it actually wasn't. That would pretty much get them rinsed for securities fraud, at the very least). The first point at which Intel publicly acknowledged 10nm wasn't happening (at least not in the way they had originally announced it) was just before Brian Krzanich resigned in June 2018, just a little before the 2018 MBPs launched.

Basically I don't think there can be any doubt that Apple initially thought these machines would be getting cooler-running chips when they designed them. They are adequately built to handle the Skylake and Kaby Lake CPUs that Intel had mapped out by early 2016, and they actually run cooler than the preceding retina-generation machines (in part due to the move away from 22nm Haswell chips). Where things come unstuck is with Coffee Lake, which we know runs warmer due to the additional cores. But with Intel putting out CPUs marketed at the same nominal TDP, there really should have been no issue using them in the MBP, just like the previous two generations. So yeah, I do think it's fair to say this is Intel's screw-up in the first place.

Now, I'm not trying to completely exonerate Apple from any blame here. I've already said I think they screwed up with the 2018 by not making more drastic changes once they got the chips and saw they ran hotter, and no, they probably shouldn't have put the i9 in there. But I think it's fair to recognise this put them in a really awkward position, and they weren't the only ones who got caught out either. Dell's XPS also struggles to cool an i9, yet the option exists for that machine too. Remember the i9 was basically marketed to replace the previous top-end i7-HK; they needed something to slot in at that level. It's Intel that essentially marketed these chips as drop-in replacements for previous-gen parts when that wasn't actually the case. Yes, Apple screwed up here too, but for a board under extreme pressure to perform financially, not releasing anything at all was probably out of the question.
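
To illustrate why two chips sold at the same nominal TDP can still behave very differently in a thin chassis, here's a rough toy model (entirely my own illustration with made-up numbers, not anything from Intel or Apple): the cooler can only dissipate so many watts, so a CPU that wants to draw more than that has to drop its clocks until package power fits the budget.

```python
# Toy throttling model (my own illustration with assumed numbers, not Intel/Apple data).
# Power is assumed to scale roughly with frequency^3 (frequency * voltage^2, with
# voltage tracking frequency), a common back-of-the-envelope approximation.

def sustained_clock_ghz(boost_clock_ghz: float, boost_power_w: float,
                        cooling_budget_w: float) -> float:
    """Clock the chip settles at once it has to stay inside the cooling budget."""
    if boost_power_w <= cooling_budget_w:
        return boost_clock_ghz  # chassis can dissipate full boost power, no throttling
    scale = (cooling_budget_w / boost_power_w) ** (1 / 3)
    return boost_clock_ghz * scale

# Hypothetical chips, both marketed as "45 W TDP", in a chassis that dissipates ~55 W:
chips = [("quad-core (2016-style)", 3.8, 60.0), ("hex-core (2018-style)", 4.1, 90.0)]
for name, boost_ghz, boost_w in chips:
    sustained = sustained_clock_ghz(boost_ghz, boost_w, cooling_budget_w=55.0)
    print(f"{name}: ~{sustained:.2f} GHz sustained (boost {boost_ghz} GHz)")
```

The point of the sketch is only that the nominal TDP figure says very little about what the chip actually draws under an all-core load, which is where the 2018 machines got caught out.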
 
  • Like
Reactions: Aquamite and afir93
It seems most of you guys think Apple will stick to Vega graphics for the 15”. Makes sense... but at the same time, AMD is rumored to reveal their new Navi chips any day now (the latest rumor is at E3 in June). If the rumors about Navi are true, we could get a dGPU with much lower power consumption and real-time ray tracing capabilities. That would definitely make the MacBook Pro far more usable for graphics-heavy applications. Not to mention, having real-time ray tracing would give us some future-proofing.

As good as Vega is, I really hope Apple can get those Navi chips in.

- AMD will first release mid-range desktop variants of Navi.
- So far, there are no rumors about Navi mobile and I don't even think AMD will release mobile versions at all.
- Navi is specifically designed for gaming and therefore not suitable for MacBooks.
- Vega is better at computing when compared to Navi.
- Ray tracing will only come to Navi 20 (big Navi) in 2020!
 
In the future there will no longer be a need for a high-end GPU for gamers... Google Stadia will be the thing.

So Apple going to ARM is more in line with what's coming in the future...
 
In the future there will no longer be a need for a high-end GPU for gamers... Google Stadia will be the thing.

So Apple going to ARM is more in line with what's coming in the future...
Some people will no longer need a high-end GPU for gaming, sure. This looks like a great way to play games for people with fast enough internet speeds. But for the rest of us, it won't really be a good option for a long time. I'm sitting here in Germany with a 50 Mbit/s connection that rarely ever reaches that maximum and hovers more around 20-30 Mbit/s most of the time; and 50 Mbit/s is the very maximum that a lot of private households in cities have here.

That 20-30 Mbit/s is a speed where I can just (barely) stream a 4K 30FPS YouTube or Netflix video without any stuttering, provided nobody else is doing any significant downloads/streaming on the same network, and that's with their relatively aggressive compression. Video games generally require much higher bitrates in order to look great, and that's without even talking about 60FPS (or even higher framerates), HDR, VR headsets (which run many games at 120FPS as standard), or, as @NBAasDOGG pointed out, input lag, which may make a lot of games that require quick reactions unplayable or unenjoyable. I once tried the PS4's Share Play feature with a friend; it was not a great experience.

And many households in the countryside here in Germany are still stuck with 16 Mbit/s connections. Good luck playing anything above 1080p with that.

And that's without even talking about what this means for modding communities, or for developers who, for one reason or another, either cannot or don't want to bring their games to such a streaming service...

Don't get me wrong. A lot of the things in the video look really promising. Being able to immediately jump into games without waiting for hours or days to install them, being able to seamlessly switch between devices Handoff-style, playing games even on mobile devices or on computers without a strong enough GPU... sure, sounds great. But for a lot of us, it will be many years before we'll have the internet speeds for this to replace conventional PC/console gaming.
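
To put some very rough numbers on the bandwidth side (these are my own back-of-the-envelope assumptions, not published Stadia requirements):

```python
# Back-of-the-envelope stream bitrate estimate (assumed numbers, not published specs):
# raw video bitrate divided by an assumed codec compression ratio, compared against
# the bandwidth a household actually gets in practice.

def stream_mbps(width: int, height: int, fps: int,
                bits_per_pixel: int = 24, compression_ratio: float = 200.0) -> float:
    """Approximate Mbit/s needed for a game stream at the given resolution and FPS."""
    raw_bps = width * height * fps * bits_per_pixel
    return raw_bps / compression_ratio / 1_000_000

available_mbps = 25  # what a nominal "50 Mbit/s" line often delivers here in practice

for label, w, h, fps in [("1080p30", 1920, 1080, 30),
                         ("1080p60", 1920, 1080, 60),
                         ("4K60",    3840, 2160, 60)]:
    needed = stream_mbps(w, h, fps)
    verdict = "fits" if needed <= available_mbps else "doesn't fit"
    print(f"{label}: ~{needed:.0f} Mbit/s needed vs {available_mbps} Mbit/s available ({verdict})")
```

Even with a fairly generous compression ratio, 4K60 lands well above what a 20-30 Mbit/s connection can sustain, which is exactly the problem described above.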
 
Some people will no longer need a high-end GPU for gaming, sure. This looks like a great way to play games for people with fast enough internet speeds. But for the rest of us, it won't really be a good option for a long time. I'm sitting here in Germany with a 50 Mbit/s connection that rarely ever reaches that maximum and hovers more around 20-30 Mbit/s most of the time; and 50 Mbit/s is the very maximum that a lot of private households in cities have here.

That 20-30 Mbit/s is a speed where I can just (barely) stream a 4K 30FPS YouTube or Netflix video without any stuttering, provided nobody else is doing any significant downloads/streaming on the same network, and that's with their relatively aggressive compression. Video games generally require much higher bitrates in order to look great, and that's without even talking about 60FPS (or even higher framerates), HDR, VR headsets (which run many games at 120FPS as standard), or, as @NBAasDOGG pointed out, input lag, which may make a lot of games that require quick reactions unplayable or unenjoyable. I once tried the PS4's Share Play feature with a friend; it was not a great experience.

And many households in the countryside here in Germany are still stuck with 16 Mbit/s connections. Good luck playing anything above 1080p with that.

And that's without even talking about what this means for modding communities, or for developers who, for one reason or another, either cannot or don't want to bring their games to such a streaming service...

Don't get me wrong. A lot of the things in the video look really promising. Being able to immediately jump into games without waiting for hours or days to install them, being able to seamlessly switch between devices Handoff-style, playing games even on mobile devices or on computers without a strong enough GPU... sure, sounds great. But for a lot of us, it will be many years before we'll have the internet speeds for this to replace conventional PC/console gaming.


Video games are more about latency than bandwidth.
But you’re not totally wrong, this will mean that both will be crucial at higher resolutions.
 
Video games are more about latency than bandwidth.
But you’re not totally wrong, this will mean that both will be crucial at higher resolutions.
I wouldn't really say that it's more about one than the other – the industry is moving towards 4K at 30/60FPS + HDR in games, so being stuck at 1080p 30FPS or even a sub-1080p resolution with no HDR, for those whose bandwidth just doesn't support more, would be a pretty big regression, and doesn't exactly sound like the "future" of gaming. I don't think you would be very happy playing games at 720p or even lower on your 4K HDR/Dolby Vision TV (maybe in a couple of years a 5K or 8K TV), and having them become completely unplayable whenever someone else in your household wants to watch videos or download some larger files.

But yeah, like I said in my post, latency is equally a very big problem, I fully agree about that. Even relatively small input delays can be a big annoyance and make any kind of game that requires quick reaction times unplayable.
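
For a rough idea of where that input lag comes from in a streamed game, here's a simple latency budget; every number below is an assumption for illustration only, not a measured figure for Stadia or any other service:

```python
# Rough "button-to-photon" latency budget for a streamed game.
# All stage timings below are assumptions for illustration, not measurements.

latency_ms = {
    "input capture + upload":             5,
    "network to data centre (one way)":  20,
    "game simulation + render (60 FPS)": 17,
    "video encode":                       5,
    "network back to player (one way)":  20,
    "decode + display":                  15,
}

for stage, ms in latency_ms.items():
    print(f"{stage:38s} {ms:3d} ms")
print(f"{'total':38s} {sum(latency_ms.values()):3d} ms")

# Local play skips both network hops and the encode/decode stages, which is why
# even a modest round-trip time is very noticeable in reaction-heavy games.
```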
 
So the best thing to do in this time frame is just to wait out the 2019 Pro announcement, it seems... I am on the verge of buying a MacBook Pro 2018, but the keyboard issues are too scary for me to pick up a second-hand machine...

I will have to wait and see if the 2019 version has improved keys, I guess... the rest of the hardware specs are a bonus in that case. I'm too worried about picking up a second-hand MacBook Pro 2018 with only half a year of AppleCare left and ending up with an expensive brick after 2 years of use if the keyboard fails... at that point I'm not going to invest 600+ USD in a laptop repair.
I wouldn't really say that it's more about one than the other – the industry is moving towards 4K at 30/60FPS + HDR in games, so being stuck at 1080p 30FPS or even a sub-1080p resolution with no HDR, for those whose bandwidth just doesn't support more, would be a pretty big regression, and doesn't exactly sound like the "future" of gaming. I don't think you would be very happy playing games at 720p or even lower on your 4K HDR/Dolby Vision TV (maybe in a couple of years a 5K or 8K TV), and having them become completely unplayable whenever someone else in your household wants to watch videos or download some larger files.

But yeah, like I said in my post, latency is equally a very big problem, I fully agree about that. Even relatively small input delays can be a big annoyance and make any kind of game that requires quick reaction times unplayable.

Lots of ISPs have a full monopoly. They won't upgrade their existing infrastructure for another 10-20 years.
It is already the bottleneck in many countries, especially outside the bigger cities; even in the USA it's like that.

We can already buy a 4K monitor and a 4K-capable PC/PS4 etc. for dollars and dimes, relatively speaking. Most of that glorious stuff is rendered client side, but all the individual moving pieces, players and effects still have to be sent back and forth between all the players in that proximity.

The more stuff they add to games that is triggered by players, the harder it becomes to play them over existing infrastructure, as there will be more and more packet congestion leading to drops and ping spikes.

People stream everything these days, consuming much more bandwidth than in the old pirating days (50 hardcore download/upload peers back then vs 5000 users hardly using their internet during most of the day). Now it's 5000 casuals who stream everything, with lots more data being sent back and forth, because everyone has Netflix these days it seems. ISPs prioritize the data they make the most money on, and that's their own streaming services or their partners'.

So games will run less well over the internet due to QoS. ISPs still have plenty of leverage, but the way we consume data over the internet is much heavier than before, which has a negative impact on online gaming.
 
So the best thing to do in this time frame is just to wait out the 2019 Pro announcement, it seems... I am on the verge of buying a MacBook Pro 2018, but the keyboard issues are too scary for me to pick up a second-hand machine...

I will have to wait and see if the 2019 version has improved keys, I guess... the rest of the hardware specs are a bonus in that case. I'm too worried about picking up a second-hand MacBook Pro 2018 with only half a year of AppleCare left and ending up with an expensive brick after 2 years of use if the keyboard fails... at that point I'm not going to invest 600+ USD in a laptop repair.

I would definitely wait at this point. I would guess an update/upgrade is right around the corner.
 
  • Like
Reactions: alongdingdong
I would definitely wait at this point. I would guess an update/upgrade is right around the corner.

Yeah, we'll have to do that then... Kinda sucky because I have a 2-month holiday between now and moving back to my home country, so I was hoping to be more productive in the video editing department these 2 months.

Will buy an all-new 13-inch MBP in that case; it's just going to be a bit more ouchie in the price department, because right now I still live in Hong Kong and it's just cheaper here.
 
This is more than a bit off base. They have caught up, if you compare the 8-core A12X to 4-core i5s or 4-core i7s from 2 years ago on synthetic benchmarks.

First, they're not really 8-core. Check out big.LITTLE. Either you use the 4 high-end cores, or you use the 4 efficient ones.

Second, the A12X goes 20% slower "while being clocked 40% lower, while using a fraction of the power". In other words: you only give up 20% of the performance, without a fan, whereas you can use the i7 to fry eggs or to light your cigar.

Does that mean everything is all equal, and that the ARM chips will smoke an i9-9900K NOW? No. You're right in that respect. But looking at the speed of improvement of ARM and that of Intel, it's hard not to anticipate ARM overtaking Intel pretty soon.
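
Just as a quick sanity check on the "20% slower while clocked 40% lower" figure quoted above (simple arithmetic on those two numbers, nothing more):

```python
# Arithmetic on the figures quoted above: "20% slower while being clocked 40% lower".
# Taken at face value, those two numbers imply the per-clock throughput ratio below.

a12x_relative_score = 1 - 0.20   # 80% of the Intel chip's benchmark score
a12x_relative_clock = 1 - 0.40   # at 60% of the Intel chip's clock speed

per_clock_ratio = a12x_relative_score / a12x_relative_clock
print(f"Implied work per clock vs the Intel part: {per_clock_ratio:.2f}x")  # ~1.33x
```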

These ARMs are also very likely to be tuned in ways that favor good scores on short-duration and shallow data depth benchmarks. What happens when ARM needs to support more complicated systems (managing more IO is one area that jumps out), deeper/wider data, and longer-running tasks?

That's precisely the sort of myth that the article dispels. Have you read it? It's quite interesting.
 
  • Like
Reactions: afir93
First, they're not really 8-core. Check out big.LITTLE. Either you use the 4 high-end cores, or you use the 4 efficient ones.

Second, the A12X goes 20% slower "while being clocked 40% lower, while using a fraction of the power". In other words: you only give up 20% of the performance, without a fan, whereas you can use the i7 to fry eggs or to light your cigar.

Does that mean everything is all equal, and that the ARM chips will smoke an i9-9900K NOW? No. You're right in that respect. But looking at the speed of improvement of ARM and that of Intel, it's hard not to anticipate ARM overtaking Intel pretty soon.



That's precisely the sort of myth that the article dispels. Have you read it? It's quite interesting.
The A12X is not big.LITTLE. It's a true 8-core SoC.
 
The A12X is not big.LITTLE. It's a true 8-core SoC.
Well, if that's the case, I misinterpreted the two types of cores as described on Wikipedia:
The A12X features an Apple-designed 64-bit ARMv8-A octa-core CPU, with four high-performance cores called Vortex and four energy-efficient cores called Tempest.[3][1] The Vortex cores are a 7-wide decode out-of-order superscalar design, while the Tempest cores are a 3-wide decode out-of-order superscalar design.
The point of the article still stands. If you compare real workloads, the ARM is surprisingly comparable despite the fact that it can be passively cooled. Now let's see if it can be scaled up to an actively cooled processor and wait for the results.
 
Has anyone just considered upgrading when Intel is on second-gen 10nm and LPDDR5 RAM is mainstream in 2021?

Right now is such an unknown, and the plans they are making regarding bringing iOS apps to macOS have me concerned.
 
I am getting really depressed. WWDC is basically a month away. That would be the most likely time for Apple to release a "new" MacBook Pro.

But if they were going to do anything other than minor spec bumps, wouldn't we have had some leak by now? Wouldn't there have been a leak of an aluminum chassis, or some report on a new touchbar format, or.... SOMETHING?!

I am waiting to upgrade a 2013 MBP. The lack of any leak at all relating to MBP makes me wonder what the heck I am waiting for.
 
The lack of any leak at all relating to MBP

I didn't think that the MBP was water resistant :D.

I think some good bets for the next design release of the MBP are:

- Same butterfly keyboard with more so-called improvements.

- Faster and runs hotter.

- Bigger screen

- Same T2 chip with matching bridgeOS errors.

If you need an MBP now and don't absolutely need the slightly bigger screen, I would get the 2018 MBP or try to find a 2017 MBP without the T2.

In any case, Apple released the 2018 MBP last fall, so it's doubtful we will see a new one until fall of this year at the earliest (i.e. no new MBP at WWDC, and probably no mention of one).
 
  • Like
Reactions: Painter2002
I didn't think that the MBP was water resistant :D.

I think some good bets for the next design release of the MBP are:

- Same butterfly keyboard with more so-called improvements.

- Faster and runs hotter.

- Same T2 chip with matching bridgeOS errors.

The butterfly keyboard is at first a bit confusing, but as long as it works it's fine.

Faster is always better, but the MY18 already runs at 100°C; if they push it even higher, they can advertise the frying function of the MBP.

BTW: if you use the MY18 MBP on your lap, you can already fry your lap.

Currently the T2 error is the most annoying thing. Hopefully they'll fix it with an update.
 