But I think the problem with Apple is not necessarily the products themselves, but the execution.
This is very true. Products don't exist by themselves. They are made by people.
Maybe it will happen in a week! I believe Apple is going to remove the TB, or at least make it optional. The fact that the TB model has more fans, though, really made me change my mind about buying the base model.
Oh, the TB model actually has more fans? I had no clue about that! I don't like the Touch Bar, I much prefer the actual keys, but more fans is very important, to be honest!
Making it an option is the one thing I'm sure they won't do. Removing the nTB seems possible to me, dropping the TB much less likely (too much like admitting an error). But making it an option, letting people actually prove them wrong about its popularity by making it possible to directly compare TB and nTB sales with no confounding variables? That seems the least likely of all to me, even more so if they didn't artificially price the two the same.
Hmm, that doesn't make much sense. The execution is part of the product. Maybe you mean the problem is not the ideas?
Well, think of something like the Touch Bar and the T1 chip, which is a SoC not entirely different from what's in the Apple Watch. It's not too far from being its own independent subsystem. It's not far-fetched to imagine something similar with its own very small OS that handles very simple tasks, like waking the main computer when it sees certain network packets. And I'm sure they'd be doing more than that if the rumours were indeed true.
Or if you wanted to imagine a SoC that is effectively an iPad on a chip, one that borrows the display, speakers, etc. from the MBP, then that would probably work fairly well too. That SoC would run iOS, the Intel chip would run macOS, and they wouldn't be scheduling apps between them.
Both of these examples are pretty similar to how a dGPU works. When you want to run a compute job on the GPU, you first have to send it the program (called a kernel, just to keep things as confusing as possible). Then you send it the data (a copy from main RAM to GPU RAM). Then you tell the GPU to run the program on that data, possibly writing output data somewhere else. Finally, the output data has to be copied back to main RAM. The GPU is effectively a coprocessor, but it's not SMP the way a dual-socket workstation is: GPUs don't (typically) share RAM with the main CPU.
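If you want to see what that round trip looks like in code, here's a minimal sketch in CUDA (CUDA rather than Metal or OpenCL only because it spells out the three steps most plainly; doubleValues is just a made-up toy kernel):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// The "program" we ship to the GPU -- a kernel, in GPU jargon.
__global__ void doubleValues(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main() {
    const int n = 4;
    float host[n] = {1, 2, 3, 4};

    // Step 1: allocate GPU RAM and copy the input over from main RAM.
    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // Step 2: tell the GPU to run the program on that data.
    doubleValues<<<1, n>>>(dev, n);

    // Step 3: copy the output back from GPU RAM to main RAM.
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    for (int i = 0; i < n; i++) printf("%g\n", host[i]);
    return 0;
}
```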
The key here is data sharing. If you want two processors to have shared access to the same data, they have to coordinate how they access it, and that makes things more complex, and probably a bit slower too. At the same time, it's pretty convenient to share data and not have to spend time copying it from one chip to another. Engineering tradeoffs...
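CUDA's managed memory is a nice illustration of that tradeoff (again just a sketch, reusing the same toy kernel): one allocation is visible to both processors, so the explicit copies disappear, but in exchange the coordination cost shows up as an explicit synchronisation point.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

__global__ void doubleValues(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main() {
    const int n = 4;
    float *shared;

    // One allocation both the CPU and the GPU can address:
    // no explicit copies in either direction.
    cudaMallocManaged(&shared, n * sizeof(float));
    for (int i = 0; i < n; i++) shared[i] = i + 1;  // CPU fills it in...

    doubleValues<<<1, n>>>(shared, n);              // ...GPU crunches it...

    // ...but here's the coordination cost: the CPU must not touch the
    // data again until the GPU is known to be done with it.
    cudaDeviceSynchronize();

    for (int i = 0; i < n; i++) printf("%g\n", shared[i]);
    cudaFree(shared);
    return 0;
}
```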
Now, engineers sometimes come up with ingenious ways of creating magic. If you remember, back in the '80s the Amiga had one main CPU and several coprocessors that could indeed work independently with main RAM and other resources. This was, if I remember correctly, achieved by clocking the coprocessors at a fixed multiple of the main CPU and phase-shifting the clock so they never actually conflicted with each other. Which was pretty neat for the original box, but fairly inconvenient later, when they wanted to bump the clock speed of the CPU independently. And this was the early '80s, mind you. What's possible or not possible is constantly in motion.
Anyway, while many weird things get written by people who don't know tech, the low power chip idea was probably fairly feasible.
And just to keep this at least remotely on topic: I do think it would be pretty cool if Apple dropped a cheap ARM coprocessor into the MBP. Not just for sending mail in its sleep, but also for iOS app development, or just being able to run iOS apps on the MBP at full speed. I quite doubt we're going to see that this year, though.
Thanks for that detailed info. I get what you mean now.

Yeah, I would love the ARM chip for the same reason, but even more so for the presumed battery life improvements it could bring.
When Apple released iTunes with a bug that could delete data in Mac OS 10.0, was the issue due to OS X or iTunes? Because iTunes is just one part of OS X.
No way. They are not removing that thing. If anything, they will expand it.
I honestly think the problem is that the team behind the Macs doesn't have a "normal" notion of the daily usage people give these machines.
If you think about it, this theory does make sense... especially since Apple said they got tons of professionals on board to give input on the modular Mac Pro and so on. Their target audience is so confusing that nobody really understands who these computers are for at the end of the day.
If you do WORK with a machine and use VMs or whatever but don't want a 15", always go for the TB. The nTB also has a worse Wi-Fi module.
I'm willing to put up with all the downsides just to retain a full keyboard. I use my Esc key a thousand times more often than I would use a second fan. If Apple drop the nTB option, that's the end of the line for me with Apple hardware.
I've remapped the ESC key to the top left and that key to the bottom left. On an EU keyboard, the bottom-left key (/, pipe) is replicated on the right side anyway. And the top left of the main block is where ESC is supposed to be anyway, if you're used to 1980s Unix terminals made for vi.
What exactly does the ESC key do? I see it, but I've never used it, ever.
For me, the first thing I think of is vi, where ESC is used to start a command sequence. But it's used in other contexts too: to cancel out of a dialog box, abort an ongoing window operation, etc.
I think the person you're responding to just meant that if one aspect or feature of a given product has a problem, then the product itself has this problem as well, not that a single flaw in a specific feature makes the entire product bad, which is how you seemed to be reading it. So if the execution of a given product is bad, the product itself might not be bad in its entirety, but it's only fair to account for the flaws in its execution when judging it.
What about that bug that caused data to be deleted when you moved files between partitions in Leopard in 2007? Did that execution mean all of Leopard was a bad execution?
When .me failed, did that mean all of Apple's services failed? Because .me was built using the same tools as the iTunes Music Store.
Siri is just one component of the HomePod, albeit a very terrible part of it. Reviews otherwise say it's a great stereo.
Siri was crap out of the gate on the iPhone 4S. Does that mean the iPhone 4S was a bad release overall?
https://semiaccurate.com/2018/05/29/is-intels-upcoming-10nm-launch-real-or-a-pr-stunt/
I seriously suggest reading this article about Intel's 10 nm process woes and why they directly affect the future of Apple.
One arm they won't be twisting for sure is Apple's. Apple can weather this on old process technology until Intel gets it right, or until they start using A-series chips in their notebook line. Unlike some OEMs, they have a healthy smartphone, tablet, and services business to fall back on.
I seriously suggest that nobody read this article. It's written by an activist who's trolling against anything he can find to bitch about just to get clicks. I would strongly recommend that everyone here develop an eye for deciding what's quality news and what's garbage. This article belongs in the rubbish bin.
You're right in your description of the author. However, in this instance he's pretty spot on. The technical facts listed in his article are pretty damning and self-explanatory, even if you don't read his interpretation of them.
This is exactly how they bend the truth, though: some grains of fact, and then invented facts and false conclusions that serve only to support whatever lies they're trying to tell. This is how you end up "proving" that the earth is flat, that aliens are secretly infiltrating governments, that the Holocaust never happened, and so on. Except here it's applied to tech, and if you don't know the tech, I'd argue it's close to impossible to read that article and differentiate between the actual grains of fact and the conjecture.