Maybe it will happen, in a week! I believe Apple is going to remove the TB, or at least make it optional. The fact that the TB model has more fans, though, really made me change my mind about buying the base model.

I think they will evolve it. Possibly a scrubbing bar down along the side of the keyboard. Possibly a fully touch screen keyboard although I very much hope not!
 
Maybe it will happen, in a week! I believe Apple is going to remove the TB, or at least make it optional. The fact that the TB model has more fans, though, really made me change my mind about buying the base model.
Making it an option is the one thing I'm sure they won't do. Removing the nTB seems possible to me; dropping the TB much less likely (too much like admitting an error). But making it an option, letting people actually prove them wrong about its popularity by making it possible to directly compare TB and nTB sales with no confounding variables? That seems the least likely of all to me — even more so if they were to not artificially price them the same.
 
Oh, the TB model actually has more fans? I had no idea! I don't like the Touch Bar — I much prefer physical keys — but more fans is very important, to be honest!

Yeah, the reason you see so many people complaining about underclocking and thermal throttling on the entry-level Pro and the 13" Dell XPS is exactly the single-fan design.

If you do WORK with a machine and use VMs or whatever but don't want a 15", always go for the TB. The nTB also has a worse Wi-Fi module.

Making it an option is the one thing I'm sure they won't do. Removing the nTB seems possible to me; dropping the TB much less likely (too much like admitting an error). But making it an option, letting people actually prove them wrong about its popularity by making it possible to directly compare TB and nTB sales with no confounding variables? That seems the least likely of all to me — even more so if they were to not artificially price them the same.

I could see them dropping the nTB and having a single-fan entry model with the TB. That way they could easily grow the Touch Bar user base and convince devs to build for it.

BetterTouchTool is the BEST & MOST AMAZING app out there, though, because you can adapt the TB to your own workflow.
 
But I think the problem with Apple is not necessarily the products themselves, but the execution.
Hmm, that doesn't make much sense. The execution is part of the product. Maybe you mean the problem is not the ideas?
 
Hmm, that doesn't make much sense. The execution is part of the product. Maybe you mean the problem is not the ideas?

I honestly think the problem is that the team behind the Macs doesn't have a "normal" notion of how people actually use these machines day to day.

If you think about it, this theory does make sense, especially since Apple said they brought tons of professionals on board to give input on the modular Mac Pro and so on. Their target audience is so confusing that nobody really understands who these computers are for at the end of the day.
 
Well, think of something like the touch bar and the T1 chip, which is a SoC not entirely different from what's in the Apple Watch. It's not too far from being its own independent subsystem. It's not that far fetched to imagine something similar with its own very small OS that can handle very simple tasks like waking up the main computer if it sees certain network packets. And I'm sure they'd be doing more than that if the rumours were indeed true.
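That "wake the main computer if it sees certain network packets" idea is essentially how Wake-on-LAN already works. As a rough sketch of what such a tiny subsystem would check for (the helper names here are made up for illustration — this is not Apple's implementation, just the standard magic-packet format):

```python
# Sketch: the kind of "magic packet" a low-power subsystem could watch for.
# A Wake-on-LAN magic packet is 6 bytes of 0xFF followed by the target
# machine's 6-byte MAC address repeated 16 times (102 bytes total).

def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet for a MAC like 'aa:bb:cc:dd:ee:ff'."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def is_magic_packet_for(payload: bytes, mac: str) -> bool:
    """What the tiny always-on OS would check before waking the main CPU."""
    return payload == build_magic_packet(mac)

packet = build_magic_packet("aa:bb:cc:dd:ee:ff")
print(len(packet))  # 102 bytes: 6 + 16 * 6
```

The point is that the check is trivial — exactly the kind of job you could leave to a watch-class SoC while the Intel CPU sleeps.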

Or if you wanted to imagine a SoC that is effectively an iPad on a chip, that borrows display, speakers etc from the MBP, then that would probably work fairly well too. That SoC would run iOS, the Intel chip would run macOS, and they wouldn't be scheduling apps between them.

Both of these examples are pretty similar to how a dGPU works. When you want to run a compute job on the GPU, you have to send it the program (called a kernel, just to keep things as confusing as possible). Then you send it the data (copy from main RAM to GPU RAM). Then you tell the GPU to run the program on that data, and possibly write output data to somewhere else. Then the output data has to be copied back to the main RAM. The GPU is effectively a coprocessor, but it's not SMP as when you have a dual socket workstation. They don't (typically) share RAM with the main CPU.
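The copy-in / run-kernel / copy-out dance described above can be sketched as a toy model in plain Python — simulated "device memory" standing in for GPU RAM. None of these function names are a real GPU API; they just mirror the four steps:

```python
# Toy model of the dGPU workflow: separate memories, explicit copies,
# and a "kernel" that only ever sees device-side data.

device_ram = {}  # stands in for the GPU's own RAM

def copy_to_device(name, host_data):
    device_ram[name] = list(host_data)        # step 2: host RAM -> GPU RAM

def run_kernel(kernel, src, dst):
    # step 3: compute entirely on-device; output stays in GPU RAM
    device_ram[dst] = [kernel(x) for x in device_ram[src]]

def copy_from_device(name):
    return list(device_ram[name])             # step 4: GPU RAM -> host RAM

copy_to_device("in", [1, 2, 3, 4])            # upload the data
run_kernel(lambda x: x * x, "in", "out")      # launch the "kernel" (step 1: the program itself was "sent" when we defined it)
result = copy_from_device("out")              # read back the result
print(result)  # [1, 4, 9, 16]
```

Notice that the host never touches `device_ram` directly — all the cost and all the isolation come from those explicit copies, which is the tradeoff discussed next.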

The key here is data sharing. If you want two processors to have shared access to the same data, then they have to communicate on how they access that data, and that makes it more complex. Probably also a bit slower. At the same time it's pretty convenient to share data and not have to spend time copying it from one chip to another. Engineering tradeoffs...

Now engineers sometimes come up with ingenious ways of creating magic. If you remember back in the 80's, the Amiga had one main CPU and several coprocessors that could indeed work independently with main RAM and other resources. This was, if I remember correctly, achieved by clocking the coprocessors at a fixed multiple of the main CPU, and then phase shifting the clock so they never actually conflicted with each other. Which was pretty neat for the original box, but fairly inconvenient later when they wanted to bump the clock speed of the CPU independently. And this was the early 80s mind you. What's possible or not possible is constantly in motion. :)
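The interleaving trick can be sketched as a toy simulation — this is just the idea of alternating bus ownership on fixed clock phases, not actual Amiga hardware behavior:

```python
# Toy simulation of the Amiga-style trick: CPU and coprocessor share RAM
# by taking turns on fixed clock phases, so their accesses never collide.

ram = [0] * 8
access_log = []

def bus_cycle(tick):
    # Even ticks belong to the CPU, odd ticks to the coprocessor.
    owner = "cpu" if tick % 2 == 0 else "copro"
    access_log.append(owner)
    ram[tick % len(ram)] += 1  # one memory access per time slot, no arbitration needed

for tick in range(8):
    bus_cycle(tick)

print(access_log)  # ['cpu', 'copro', 'cpu', 'copro', ...]
```

The elegance — and the later inflexibility — both fall out of the same property: correctness depends on the two clocks staying locked in that fixed ratio.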

Anyway, while many weird things get written by people who don't know tech, the low power chip idea was probably fairly feasible.
And just to keep this at least remotely on topic -- I do think it would be pretty cool if Apple would drop in a cheap ARM coprocessor in the MBP. Not just for sending mail in its sleep, but also for iOS app development, or just being able to run iOS apps on the MBP at full speed. I quite doubt we're going to see that this year though.
Thanks for that detailed info. I get what you mean now
Well, think of something like the touch bar and the T1 chip, which is a SoC not entirely different from what's in the Apple Watch. It's not too far from being its own independent subsystem. It's not that far fetched to imagine something similar with its own very small OS that can handle very simple tasks like waking up the main computer if it sees certain network packets. And I'm sure they'd be doing more than that if the rumours were indeed true.

Or if you wanted to imagine a SoC that is effectively an iPad on a chip, that borrows display, speakers etc from the MBP, then that would probably work fairly well too. That SoC would run iOS, the Intel chip would run macOS, and they wouldn't be scheduling apps between them.

Both of these examples are pretty similar to how a dGPU works. When you want to run a compute job on the GPU, you have to send it the program (called a kernel, just to keep things as confusing as possible). Then you send it the data (copy from main RAM to GPU RAM). Then you tell the GPU to run the program on that data, and possibly write output data to somewhere else. Then the output data has to be copied back to the main RAM. The GPU is effectively a coprocessor, but it's not SMP as when you have a dual socket workstation. They don't (typically) share RAM with the main CPU.

The key here is data sharing. If you want two processors to have shared access to the same data, then they have to communicate on how they access that data, and that makes it more complex. Probably also a bit slower. At the same time it's pretty convenient to share data and not have to spend time copying it from one chip to another. Engineering tradeoffs...

Now engineers sometimes come up with ingenious ways of creating magic. If you remember back in the 80's, the Amiga had one main CPU and several coprocessors that could indeed work independently with main RAM and other resources. This was, if I remember correctly, achieved by clocking the coprocessors at a fixed multiple of the main CPU, and then phase shifting the clock so they never actually conflicted with each other. Which was pretty neat for the original box, but fairly inconvenient later when they wanted to bump the clock speed of the CPU independently. And this was the early 80s mind you. What's possible or not possible is constantly in motion. :)

Anyway, while many weird things get written by people who don't know tech, the low power chip idea was probably fairly feasible.
And just to keep this at least remotely on topic -- I do think it would be pretty cool if Apple would drop in a cheap ARM coprocessor in the MBP. Not just for sending mail in its sleep, but also for iOS app development, or just being able to run iOS apps on the MBP at full speed. I quite doubt we're going to see that this year though.
Yah I would love the arm chip for the same reason, but even more so for the presumed battery life improvements it could bring
 
Hmm, that doesn't make much sense. The execution is part of the product. Maybe you mean the problem is not the ideas?
When Apple released iTunes with a bug that could delete data in Mac OS 10.0, was the issue due to OS X or iTunes? Because iTunes is just one part of OS X.

What about that bug that caused data to be deleted when you moved files between partitions in Leopard in 2007? Did that execution mean all of Leopard was a bad execution?

When .me failed, did that mean all of Apple's services failed? Because .me was built using the same tools as the iTunes Music Store.

Siri is just one component of the HomePod, albeit a very terrible part of it. Reviews otherwise say it's a great stereo speaker.

Siri was crap out of the gate on the iPhone 4s; does that mean the iPhone 4s was a bad release overall?
 
On a lighter note: with WWDC 2018 only six days away, you know that if Apple plans to announce updates, somewhere in Shenzhen, China, pallets of standard configs are getting ready to ship around the world.

It seems Apple's doubling down on secrecy might be working. Then again, unless a factory worker can boot one and post the About This Mac dialog, you wouldn't be able to tell.
 
Maybe it will happen, in a week! I believe Apple is going to remove the TB, or at least make it optional. The fact that the TB model has more fans though, really made me change my mind on buying the base model.
No way. They are not removing that thing. If anything, they will expand it.
 
I honestly think the problem is that the team behind the Macs doesn't have a "normal" notion of how people actually use these machines day to day.

If you think about it, this theory does make sense, especially since Apple said they brought tons of professionals on board to give input on the modular Mac Pro and so on. Their target audience is so confusing that nobody really understands who these computers are for at the end of the day.

They are for everyone, hence a compromise of a solution. I work in design, and as soon as a client wants me to design something super flexible that can do everything, it starts to fail at delivering a focused solution. That's what the MacBook Pro is.
 
If you do WORK with a machine and use VMs or whatever but don't want a 15", always go for the TB. The nTB also has a worse Wi-Fi module.

I'm willing to put up with all the downsides just to retain a full keyboard. I use my Esc key a thousand times more often than I would use a second fan. If Apple drops the nTB option, that's the end of the line for me with Apple hardware.
 
I'm willing to put up with all the downsides just to retain a full keyboard. I use my Esc key a thousand times more often than I would use a second fan. If Apple drops the nTB option, that's the end of the line for me with Apple hardware.
I've remapped the Esc key to the top left of the main block and that key to the bottom left. On an EU keyboard, the bottom-left key (/, pipe) is replicated on the right side anyway. And the top left of the main block is where Esc is supposed to be anyway, if you're used to 1980s Unix terminals made for vi :)
 
When Apple released iTunes with a bug that could delete data in Mac OS 10.0, was the issue due to OS X or iTunes? Because iTunes is just one part of OS X.

What about that bug that caused data to be deleted when you moved files between partitions in Leopard in 2007? Did that execution mean all of Leopard was a bad execution?

When .me failed, did that mean all of Apple's services failed? Because .me was built using the same tools as the iTunes Music Store.

Siri is just one component of the HomePod, albeit a very terrible part of it. Reviews otherwise say it's a great stereo speaker.

Siri was crap out of the gate on the iPhone 4s; does that mean the iPhone 4s was a bad release overall?
I think the person you're responding to just meant that if one aspect or feature of a product has a problem, then the product itself has that problem as well — not that a single flaw in one feature makes the entire product bad, which is how you seemed to understand it. So if the execution of a product is bad, the product itself might not be bad in its entirety, but it's only fair to account for the flaws in its execution when judging it.

In this sense, your examples don't really fit that well here. For example, Siri being a lackluster voice assistant also drags down the HomePod as a product (for everyone who is at least partially looking for the "smart" in "smart speaker"). That doesn't mean the HomePod is bad in its entirety, as the excellent sound makes up for it for a lot of people. Similarly, not all of Apple's services "failed" because of .me's shortcomings, but that doesn't mean Apple's services weren't flawed at the time because of .me, etc.
 
https://semiaccurate.com/2018/05/29/is-intels-upcoming-10nm-launch-real-or-a-pr-stunt/

I seriously suggest reading this article about Intel's 10 nm process woes and why it directly affects Apple's future.
One arm they won't be twisting for sure is Apple's. Apple can weather it out on old process technology until Intel gets it right, or they can start using A-series chips in their notebook line. Unlike some OEMs, they have a healthy smartphone, tablet, and services business to fall back on.

It just sucks that we might go through another drought period for the MacBook Pros like the one in 2015 to 2016. If anything, the 13-inch MBPs will likely see 10 nm first, similar to Broadwell in the early-2015 13-inch MBP.

At this rate, roadmap releases like Ice Lake and Tiger Lake are likely five years out.
 
https://semiaccurate.com/2018/05/29/is-intels-upcoming-10nm-launch-real-or-a-pr-stunt/

I seriously suggest reading this article about Intel's 10 nm process woes and why it directly affects Apple's future.
I seriously suggest that nobody read this article. It's written by an activist who's trolling against anything he can find to bitch about just to get clicks. I would strongly recommend that everyone here develop an eye for deciding what's quality news and what's garbage. This article belongs in the rubbish bin.
 
I seriously suggest that nobody read this article. It's written by an activist who's trolling against anything he can find to bitch about just to get clicks. I would strongly recommend that everyone here develop an eye for deciding what's quality news and what's garbage. This article belongs in the rubbish bin.
You're right in your description of the author. However, in this instance he's pretty spot on. The technical facts listed in his article are pretty damning and self-explanatory, even if you don't read his interpretation of them.
 
I plan on buying a new MBP no matter what comes out (or doesn't) next week. I have my mid-2012 and do a lot of video editing; it's painful to use at the moment. I'm debating getting a 13" one with a second monitor at home, but it depends on whether a quad-core is released in that size.
 
You're right in your description of the author. However, in this instance he's pretty spot on. The technical facts listed in his article are pretty damning and self-explanatory, even if you don't read his interpretation of them.
This is exactly how they bend the truth, though: some grains of fact, and then invented facts and false conclusions that serve only to support whatever story they're trying to tell. This is how you end up "proving" that the earth is flat, that aliens are secretly infiltrating governments, that the Holocaust never happened, and so on. Except here it's applied to tech, and if you don't know the tech, I'd argue it's close to impossible to read that article and differentiate between the actual grains of fact and the conjecture.
 