So what do you all think is more likely: that we might see a price drop this year, or that they will retain the same pricing but increase the base storage size?
- Price drop, maybe by $100
- Improved butterfly keyboard
- Maybe 512 GB on the entry-level 15-inch model and 256 GB on the 13-inch, though they might actually raise the price by $100 to justify it. As always, Apple likes to boast about how much faster its flash storage is.
- Entry-level Touch Bar 13-inch might get a bump to 16 GB by default.
- A non-Touch Bar 15-inch model.
- T2 chip for secure boot
- Intel Coffee Lake chips
- Improved AMD graphics
- Possible Intel/AMD Vega graphics for the 13-inch model
- Return of the backlit Apple logo
 

256GB / 16GB is the ideal 13” setup that I want.

Hopefully 256GB becomes standard on the Touch Bar model; c’mon, it’s $2k for that here.
 
I'm going to bet on U-series quad-core CPUs with Iris graphics on the 13", and the 15" with the Intel i7-8705G/Vega. Not sure if they will do hexa-core without a redesign, due to heat.
 
256 is standard for the TB model; it’s the nTB model that has a variant starting at 128 (and that was introduced only in 2017, at a lower price point). I believe flash memory prices are now starting to come down slowly from their peak late last year, but they have a long way to go to get back even to where they were in 2015...
 

NAND stock is at an all-time high.

I meant nTB, my bad. I do want the TB model because it doesn’t have the throttling problems the nTB has, etc.
 

Hmmm... if they actually do Vega on the 15", I will have to reconsider my decision to purchase the 13".
 

If they used the i7-8709G/Vega in an nTB 15" (2015 body, with the old ports plus TB3 and MagSafe, an updated screen, and redesigned cooling), it would be perfect... a lot better than the 2016/2017 TB 15". I would actually be tempted to buy it...

If they wanted to, they could fit a 100 W TDP in that chassis: more or less the same CPU performance and a lot better GPU performance...
 
I interpret the calls to stop shipping 128GB of storage at the low end as a call to make 256GB the low-end config at the same price as 128GB, and correspondingly decrease the prices of the 512GB, 1TB, 2TB, etc. configs. Everyone benefits from more storage for their buck.

Keep in mind the 570/580 and 570X/580X are likely far too power-hungry to fit in an MBP. If Vega isn't ready in time, I expect the highest-end GPU to be a 560X, much like the current highest end is a 560.

A mobile Vega with about 12–16 CUs (compute units) should provide the best possible performance within the MBP's power budget. That is, if it's ready in time, and if AMD is even making a mobile Vega of that size.

The current big Vega chip has 64 CUs (the cheaper 56-CU version is the same chip with 8 CUs disabled). AMD implied that it is working on one mobile Vega chip for this year. If AMD feels that the desktop-replacement / gaming-laptop market is more lucrative, it's possible that mobile Vega is being designed with 24–32 CUs, which would be a very compelling competitor to Nvidia's mobile 1060/1070 chips but likely would not be a fit for the MBP.
Oh yeah, I think I got the numbers mixed up. I meant to say 450 and 560, because I think the max card in the 2016 was a 460, not a 480, right? I'm so used to Nvidia naming conventions, so my bad!
This isn't really something you'd want to do, for a number of reasons.

From a hardware perspective, when two or more CPUs exist on the same bus and compete for the same resources (like memory, PCIe lanes, etc.), they will generally conflict with each other unless they have some protocol to negotiate access to those resources, or just for cache coherency and the like. Xeon processors (for example) can do this among each other; I'm pretty sure they cannot do it with an ARM processor, and the usual Core processors can't do it at all. So maybe you could set up some heterogeneous system where the second chip acts as a coprocessor, similar to how a GPU works, but then you may also end up having to use dedicated memory, or have a nightmare with sharing. Nightmare meaning that it will be far too slow to be of any practical use.

Then, from the software side, you're going to run into a problem very quickly if the OS is on a different architecture than the app itself. With UNIX and friends, apps need access to the part of the OS where the syscalls are implemented so that this code can be executed from user-space processes. On Linux this is achieved by mapping part of the kernel into every user process; I don't know how it's done on Darwin, but I wouldn't be surprised if it's something similar. This is of course for performance reasons: syscalls get really slow if you need a context switch as part of each call. Now, what do you think happens if a user-space app executing on the Intel processor maps ARM syscall code into its memory space and then starts executing that ARM code? It can't. One might imagine some other way to interface with the OS that allows for this, but it would end up being painfully slow.
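To make that point a bit more concrete: even the most trivial syscall ends in an architecture-specific trap instruction inside libc, which is exactly why one ISA's syscall path is useless to a CPU of another ISA. A toy illustration (Python with ctypes, purely illustrative; assumes a Unix-like system where libc exports getpid):

```python
import ctypes
import os

# Handle to the C library the current process is already linked against.
libc = ctypes.CDLL(None)

# libc's getpid() wrapper ends in an architecture-specific trap
# instruction (syscall on x86-64, svc on arm64) that enters the kernel.
# Machine code compiled for a different ISA simply cannot execute it.
pid_via_libc = libc.getpid()

# Python's os.getpid() reaches the same kernel entry point,
# so the two answers agree.
print(pid_via_libc == os.getpid())
```

The app and the kernel-facing stub have to be the same machine code the CPU actually speaks, which is the wall the cross-architecture scheme above runs into.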

But this is all ignoring the fact that it's not really needed. The OS is lightweight enough that it doesn't need to be moved off to a separate CPU.

There's much more that could be said, but the point is that it would be extremely tricky to get it to work at all, and you wouldn't achieve anything desirable with this model. There are probably other ways to achieve something with similar effect though.
Thanks for that concise answer. I really appreciate it.

Do you know why people were saying on MacRumors and 9to5Mac that Apple was planning an Intel processor with an ARM coprocessor handling low-power tasks, then? This is not me being a jerk; I'm just really curious why that's possible but my theoretical three-processor idea isn't. To someone not well versed in the subject it seems all too similar, because one of the processors has to know whether to send the program to the other processor.

Honestly, those rumors might have been silly because it isn't actually possible, but I feel I must be missing something.
 

Quad-core Iris on the 13" is exactly up my alley.

As much as I like the 15" and having a dGPU, I can't bring myself to jump on the Radeon. I don't want to game on the laptop, and tbh I always have a Switch with me...

If they had an NVIDIA 1060 I would have reconsidered my purchase and jumped to the 15" via Boot Camp for the occasional Civ game, though.
 
I know it doesn't pertain to the new MBP, and Apple can't be blamed for LPDDR4 not being available, but it still fits the neglect of the Mac lineup.

https://www.zerohedge.com/news/2018-05-27/1995-steve-jobs-explained-exactly-how-apple-will-fail

He was not only a visionary, but a prophet of the technology field too...
I think he is absolutely right.
Compare the Apple of 2010 with the Apple of 2018, especially in Mac computers.
Don't tell me about just 'numbers', sales, stock market share, etc.
Quality, innovation, and the like are all rather in decline...
 
Good find indeed. This is why I'm excited by the Apple discussions about workflow and the rumours of a possible ARM Mac or hybrid: product development over marketing. But in the meantime, I wouldn't mind a 13" tbMBP. A screen with a wider gamut would always be welcome, and I'm certain Apple will have butterfly keyboard v3. Would they integrate a T2 chip like in the iMac Pro?
 
What? No lol. Obviously not. The rest of the list makes sense though.
It's not as far-fetched as it seems.
Of course; it was just as evident that they were going to use the quad-core U-series back in 2016. They may have models that use the G-series and models that use the H-series, but they will be using the hexa-core chips.
 
Well, think of something like the Touch Bar and the T1 chip, which is an SoC not entirely different from what's in the Apple Watch. It's not too far from being its own independent subsystem. It's not that far-fetched to imagine something similar with its own very small OS that can handle very simple tasks, like waking up the main computer if it sees certain network packets. And I'm sure they'd be doing more than that if the rumours were indeed true.

Or if you wanted to imagine an SoC that is effectively an iPad on a chip, borrowing the display, speakers, etc. from the MBP, then that would probably work fairly well too. That SoC would run iOS, the Intel chip would run macOS, and they wouldn't be scheduling apps between them.

Both of these examples are pretty similar to how a dGPU works. When you want to run a compute job on the GPU, you have to send it the program (called a kernel, just to keep things as confusing as possible). Then you send it the data (copied from main RAM to GPU RAM). Then you tell the GPU to run the program on that data, possibly writing the output somewhere else. Then the output data has to be copied back to main RAM. The GPU is effectively a coprocessor, but it's not SMP as in a dual-socket workstation; GPUs don't (typically) share RAM with the main CPU.
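That copy-in / launch / copy-out flow can be sketched like this (plain Python lists standing in for host and device buffers; no real GPU involved, just the shape of the pipeline):

```python
# Input data living in "main RAM" on the host.
host_in = [float(i) for i in range(8)]

# Step 1: copy the input into (simulated) device memory.
device_in = list(host_in)          # stands in for a host-to-device transfer

# Step 2: the "kernel": the program the GPU runs over the data.
def double_kernel(buf):
    return [x * 2.0 for x in buf]  # trivial elementwise compute

device_out = double_kernel(device_in)   # "launch" the kernel

# Step 3: copy the result back from device memory to host memory.
host_out = list(device_out)

print(host_out)  # [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0]
```

The explicit copies at steps 1 and 3 are the whole point: neither side touches the other's memory directly, which is what makes a coprocessor so much simpler to build than a true shared-memory SMP system.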

The key here is data sharing. If you want two processors to have shared access to the same data, then they have to communicate on how they access that data, and that makes it more complex. Probably also a bit slower. At the same time it's pretty convenient to share data and not have to spend time copying it from one chip to another. Engineering tradeoffs...

Now, engineers sometimes come up with ingenious ways of creating magic. If you remember back in the '80s, the Amiga had one main CPU and several coprocessors that could indeed work independently with main RAM and other resources. This was, if I remember correctly, achieved by clocking the coprocessors at a fixed multiple of the main CPU and then phase-shifting the clock so they never actually conflicted with each other. Which was pretty neat for the original box, but fairly inconvenient later when they wanted to bump the clock speed of the CPU independently. And this was the early '80s, mind you. What's possible or not possible is constantly in motion. :)
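The Amiga trick amounts to time-slicing the bus: if the coprocessor's accesses are phase-shifted so they land only on cycles the CPU never uses, the two never collide and no arbitration protocol is needed. A toy simulation of that interleaving (hypothetical cycle counts, purely to show why there are no conflicts):

```python
# Simulate 16 bus cycles shared by a CPU and a coprocessor.
# The CPU is granted the even cycles; the phase-shifted coprocessor
# gets the odd cycles, so neither ever has to wait for the other.
CYCLES = 16

cpu_slots = [t for t in range(CYCLES) if t % 2 == 0]
copro_slots = [t for t in range(CYCLES) if t % 2 == 1]

# No cycle is claimed by both devices: zero conflicts by construction.
conflicts = set(cpu_slots) & set(copro_slots)
print(len(conflicts))  # 0

# And no cycle is wasted: every slot is used by exactly one device.
print(sorted(cpu_slots + copro_slots) == list(range(CYCLES)))  # True
```

The downside is exactly the one noted above: the scheme only works while both clocks stay locked in that fixed ratio, so speeding up one chip independently breaks the interleaving.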

Anyway, while many weird things get written by people who don't know tech, the low power chip idea was probably fairly feasible.
And just to keep this at least remotely on topic: I do think it would be pretty cool if Apple dropped a cheap ARM coprocessor into the MBP. Not just for sending mail in its sleep, but also for iOS app development, or just being able to run iOS apps on the MBP at full speed. I quite doubt we're going to see that this year, though.
 
It does indeed. I would love to see a non-TB, dual fan 13" model with a dGPU. But alas, that will never happen.
Maybe it will happen, in a week! I believe Apple is going to remove the TB, or at least make it optional. The fact that the TB model has more fans, though, really made me change my mind about buying the base model.
 

If Steve Jobs had died in 2003, then I would have expected an eventual implosion at Apple. But I think one of the things Steve Jobs instituted was a policy of hiring the best people who shared his values, from the top to the bottom, and there are a lot of them at Apple. He did say Jony Ive was his spiritual partner; he did hand-pick Tim Cook.

But I think the problem with Apple is not necessarily the products themselves, but the execution: the disconnect between USB-C and Lightning devices; Siri still not up to the standard of Alexa and Google Assistant; unfinished products like the HomePod, and even the MacBook Pro's butterfly keyboard.

There is also a lot of consistency that's missing, and I don't know if it's a lack of human engineering resources on each product or the lack of a micro-manager to ensure quality. But if there are a few things that are problematic at Apple, they are execution, delivery, and integration.

Another factor in why these things may be happening is that the company is trying to set itself up for the next couple of decades. If you look at where Apple is investing a lot of its engineering and design, it's in wearables, services, automation, and AI (which is the weakest link). Many here don't want to hear it, but the iPhone, iPad, and Mac are legacy.

Steve Jobs's passion was always to find what's next. If you think Steve Jobs would still have been using an iPod Classic when the iPhone was released, think again. If you read the Walter Isaacson biography: when he went on vacation to Hawaii, he synced all his movies to an iPad.

So, Steve Jobs would have been OK with some of the decisions today, but his ultimate aim was to remove a lot of complexity and replace whatever he created with something better.

Going back to decline: companies like Apple, Google, Facebook, and even Microsoft are here to stay. They are the new IBMs. A hundred years from now, people will still be talking about all four, though they might be doing completely different things, of course. They have latched onto a level of success and wealth that makes them so intertwined with our society and daily lives. Not to mention, their assets and investments are so diverse that they could fail a hundred times over and still recover.
 
I don't mind the TB living on, but only in addition to physical F keys. Spending any time debugging in Xcode with the TB is an exercise in absolute frustration, as you often need to hit F6 or F7 repeatedly, and missing it just once can throw your whole debug session off.
 