
doitdada

Suspended
Oct 14, 2013
946
557
Apple was about the most balanced solution...so true. The only true innovation since they launched the Apple I about 41 years ago has been the iPhone. The iPod was an MP3 player, and didn't take on the world the way Apple did with their desktop computers and the iPhone.

The gimmick lab has doubled in size since 2011, and now has almost 120,000 employees. The brand has a wide array of products, so balancing the vast workforce may be a bigger challenge than "innovating". I bought Apple because of the balance they had. The products knew where to start and where to end. I think Apple is still good for consuming information, like streaming YouTube videos or reading on an iOS device. The iPhone is still the best communicator out there, and the MacBook 12 is a great computer for light work, but the Pro line remains a nemesis for the people who develop games and apps for both iOS and macOS.

New blood in the Apple locker room may be the solution. Somebody with the power to axe the lag in deployment and figure out how to deliver machines for the community that fuels the $28 billion App Store market. The Mac may only be a $7.5 billion market, but it also makes the rest of the Apple ecosystem worth the hefty price tag they attach to their iOS devices, because the software is written on legacy computers like the MacBook Pro and Mac Pro.
 

Sanpete

macrumors 68040
Nov 17, 2016
3,695
1,665
Utah
The only true innovation since they launched the Apple I about 41 years ago has been the iPhone.
This implies an extremely narrow concept of what innovation is. Apple and several other manufacturers have made many innovations since then, by normal standards. It doesn't have to be something that didn't exist, only something done in a new and better way.
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Yes, and flash/SSD prices are a fraction of what they were in 2012 (first rMBP). 256GB on a $2400 notebook?
Just as NAND prices in 2015 were a fraction of what they were in 2016, which in turn were a fraction of what they are now. In other words: NAND prices have gone up and are still rising.

People are not happy with the current model and the volume and veracity of negativity that has emanated due to this release has reached Apple's ears.
This is the MacBook Pro we are talking about, not the Mac Pro. The sales numbers are what they are and show that the MBP is a success whereas the Mac Pro is not. It's why they are changing the Mac Pro and why they are not going to change the MBP.

Also, the veracity of negativity is something we've seen throughout 2016 with just about anything. People have lost their minds and self-control, which now has us demonstrating for reality. Absolutely ridiculous.

Agreed, but as reported in the news thread, it was the complaints about the MBP that finally pushed Apple to face the issue of professional customers being very unhappy.
And if you read the interview with Gruber, it wasn't. It was the negative comments concerning the new Mac Pro, and they came from a lot of people who used it differently than what Apple designed it for (mostly the use of multiple GPUs). That's what they apologised for and why they are changing the Mac Pro. And that was even on the front page of many websites such as MacRumors.

Yes, at how unhappy many people are with Apple's latest offerings.
Yet it is their best-selling MBP thus far. Emotions vs. data: they each tell a completely different story. Let's stick to the actual facts instead of basing things on emotions.

CPU ever-so-slightly faster, same RAM, same storage, worst keyboard ever, trackpad that can't reject extraneous input, emoji toy bar, 25% smaller battery, etc., and for what?
This is the case with every computer on the planet when compared to the previous model, because we are limited by something called physics (some call it Mother Nature, others God, Allah, etc.). It is not the case when you compare it to older models from, say, 2011/2012/2013, from which a lot of people seem to be upgrading.

To be thin?
Well, we are talking about a notebook here, so yes, size and weight are important. We are still limited by physics, so it is still a choice between a bulky, powerful, battery-sucking notebook and a light, thin, not-so-powerful one. Over the decades this has changed somewhat: the not-so-powerful-but-energy-efficient components have become more powerful without becoming less efficient, while the powerful-but-energy-inefficient components have become more efficient without becoming less powerful. They are (very) slowly but surely growing towards each other. That's why we now have quad-core CPUs in notebooks that can run for 8 to 10 hours on battery, versus a single-core CPU that could only run for an hour or two at most.

It is rather nice not to have to haul around a 4 kg machine the size of a binder.

Now if you actually read that "best laptop" article, you quickly see that it is absolutely useless. There is no explanation whatsoever of the notebooks themselves, only a vague definition of the categories, and no mention of how the points are awarded (when do you get 10 out of 35, when 20 out of 35, when 23 out of 35, etc.). It's just a number-juggling game. The only reason given for going Lenovo over Microsoft is the number of points, not better support, faster hardware, better battery life and so on. If they had actually done that, it would have been useful.

I'll have to check them out, I know Lenovo has had a good reputation with the ThinkPads
With emphasis on "had", if you believe the ThinkPad users who have used them since the IBM era. There have been many, many complaints about the keyboard, Linux support, the installed software (bloated, and they even shipped malware) as well as the general build quality (which, according to users who've been using ThinkPads for a long time, has been decreasing ever since IBM sold the line to Lenovo; I don't entirely agree with that, because we went through some economically difficult times and even IBM would have made compromises on build quality to cut costs).
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
Apple was about the most balanced solution...so true. The only true innovation since they launched the Apple I about 41 years ago has been the iPhone.

Your ignorance and sarcasm would be funny if it weren't this arrogant. Significant examples of Apple's innovation, just off the top of my head:

- the ultrabook category: it was Apple who asked Intel to design the first ULV CPU that would enable the construction of the MacBook Air. Initially met with criticism, this approach has proven incredibly successful and has defined the state of CPU technology to this day
- redefining wired connectivity: FireWire and, later and more significantly, Thunderbolt
- Apple reportedly convinced Intel to invest more R&D into their integrated GPU, triggering a massive boost in iGPU performance and making Apple at least partially responsible for the transformation of the iGPU from the pointless thing it was into a formidable GPU able to compete with lower-end dedicated cards
- popularising GPU compute: OpenCL was developed by Apple
- first manufacturer to use a HiDPI display on a consumer device
- popularisation of multi-touch technology in the consumer space
- AFAIK the first true auto-tiered storage solution in the consumer space
- LLVM, while started as an independent project, was picked up and supported by Apple, which brought a new major player into the compiler landscape and massively influenced software development
- research and popularisation of safe programming languages: not really an innovation, but immensely important

What kind of innovation did Dell do in the last twenty years? Its pinnacle was building a cheaper MBP clone.
 
Last edited by a moderator:

doitdada

Suspended
Oct 14, 2013
946
557
Your ignorance and sarcasm would be funny if it weren't this arrogant. Significant examples of Apple's innovation, just off the top of my head...

Good list, but most of them fall under optimisation rather than innovation. The ULV movement is more of a trend than an innovation, though the fanless MacBook may count as one. In fact, the MacBook 12 is the computer I have been waiting for ever since Intel chips were introduced to the Mac.
 
Last edited:

Melrose

Suspended
Dec 12, 2007
7,806
399
I bought a new 13" Air a year or two ago and I couldn't be happier with it. There are some rabid fanboys who love the rMacBook, but honestly they're hugely overpriced for what they actually are.

But when upgrade time rolls around, which one will I get if they kill the Air? That Touch Bar nonsense just looks like...nonsense. Plus I just got through upgrading all my dongles to USB stinking 3.0 about 9 months ago. I'll continue using Mac and iOS because they're more stable and secure than Android, but Apple is definitely not what it used to be.

Your ignorance and sarcasm would be funny if it weren't this arrogant. Significant examples of Apple's innovation, just of top of my head:

...

And yet you left out the revolutionary products Apple has had. The iPhone. The LaserWriter. GUI to the masses. Typefaces and desktop publishing. Things like this impact the way society communicates, and that's huge.

...good heavens, even the layout of our modern laptops is from Apple.
 
Last edited:

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
- redefining wired connectivity: firewire and later, more significantly, Thunderbolt
Unlike FireWire, Thunderbolt is not a technology from Apple; it is a technology from Intel. Apple helped out when Intel asked them for help with the connector, and that connector has now been replaced by one from the USB-IF. Everything else is Intel's work, not Apple's.

I wouldn't say Apple did a great job of promoting Thunderbolt. They included it on their machines and made a few accessories, but that's all they did. The 2016 MBP is a real innovation in that Thunderbolt is the only port on that machine (well, besides a 3.5mm jack). However, I don't think it is Apple that made Thunderbolt more popular; I think it is what Intel included in Thunderbolt 3 (most notably USB 3.1 Gen 2 and the USB-C connector) that is making it very popular. That also explains why we are now seeing more machines with Thunderbolt.

- Apple reportedly convinced Intel to invest more R&D into their integrated GPU, triggering a massive boost of iGPU performance and making Apple at least partially responsible for the transformation of the iGPU from the pointless things they were into formidable GPUs able to compete with lower-end dedicated cards
Apple wasn't the only one doing that; all of the OEMs did so because they could sell cheaper machines. AMD's APUs were even a big threat to Intel in this regard, so they had to respond.

- popularising GPU compute: OpenCL was developed by Apple
Apple did no such thing. Nvidia came up with GPU compute, called it CUDA, and it has become the most popular. Nvidia does an enormous amount of work popularising it amongst all kinds of people (companies, students, etc.). They, for one, offer special devices for use in supercomputers, workstations and servers, and even via Thunderbolt or remotely (they have something akin to AWS). Neither Apple nor any of the other OpenCL members does any of that. In fact, Apple created OpenCL, turned it into a generic technology and never spoke about it again. They are not even amongst the working group members (only listed as an implementer). But all that is to be expected from a company that ditched its own grid computing technology (Xgrid).

- research and popularisation of safe programming languages, not really innovation, but immensely important
As do many others, so no difference there. I think it is better to mention the privacy aspect; that's something not many companies defend as strongly as Apple does.

What kind of innovation did Dell do in the last twenty years? Its pinnacle was building a cheaper MBP clone.
Dell and Apple are very different companies. Apple is not in the enterprise world when it comes to storage, servers, software (they are now entering that one with iOS with the help of Cisco and IBM) and Linux, but Dell is. All of Dell's innovations lie in those segments, and there are plenty.
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
Good list, but most of them fall under optimisation rather than innovation.

I agree that the notion of "innovation" is rather vague. Personally, I use the term to refer to choices, designs, etc. that have resulted in impactful changes. E.g. ULV CPUs kickstarted the ultrabook category and ultimately led to the modern shift towards CPUs that can dynamically adjust their performance and be fast and efficient at the same time. This has had a profound effect on the capabilities of modern computers, and Apple's influence here is undeniable. The same goes for HiDPI displays, etc.

If we are strictly talking about inventions, then there is of course not much that Apple or any other company can claim for themselves. Apple's chief achievement is taking such inventions and bringing them to the market, making them impactful. If you want to focus on inventions only, there are plenty of things on a more technical level, like the manufacturing process for the unibody Macs, but those are less visible and less impactful.

And yet you left out the revolutionary products Apple has had. The iPhone. The LaserWriter. GUI to the masses. Typefaces and desktop publishing. Things like this impact the way society communicates, and that's huge.

I was replying to a previous post, and that person had already included the iPhone. I tried to focus on post-iPhone things.
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Ah, I forgot about the ULV one. That cannot be attributed to Apple either, since Intel already had those CPUs before Apple switched from PowerPC to Intel. What Apple did was request a smaller CPU package from Intel so they could use it in a small and light notebook (aka the MacBook Air). That chip was a one-off. Perhaps it was a nudge for Intel to start the Ultrabook category, but we mustn't forget that it was Intel who wanted notebooks to become more popular and started the entire Centrino thing (which was about the CPU, chipset and Wi-Fi). Ultrabook is merely an extension of that.
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
Unlike FireWire, Thunderbolt is not a technology from Apple; it is a technology from Intel. Apple helped out when Intel asked them for help with the connector, and that connector has now been replaced by one from the USB-IF. Everything else is Intel's work, not Apple's.

I was under the impression that it was a joint development? I totally agree that the start was slow, but I think it was mostly because Thunderbolt was expensive and not many vendors were interested in adopting it. In the end, it was a fairly limited market. But it kickstarted some very important developments in connectivity.


Apple wasn't the only one doing that; all of the OEMs did so because they could sell cheaper machines. AMD's APUs were even a big threat to Intel in this regard, so they had to respond.

I was referring to a different thing. Low-performance CPUs existed at that time, true, but they were indeed low-performance and targeted at the low end of the market. By asking Intel to develop a custom, more efficient CPU version for the consumer market, Apple started moving the "meta" in a different direction. Also, we are talking about 2008 here. I thought AMD APUs arrived much later?

Apple did no such thing. Nvidia came up with GPU compute, called it CUDA, and it has become the most popular. Nvidia does an enormous amount of work popularising it amongst all kinds of people (companies, students, etc.).

And yet Apple created OpenCL. CUDA is a proprietary Nvidia tech, and of course Nvidia is promoting it; it brings them money. Frankly, this is a dick move by Nvidia and nothing worthy of praise.

They are not even amongst the working group members (only listed as an implementer). But all that is to be expected from a company that ditched its own grid computing technology (Xgrid).

Well, they have since developed Metal, which is supposed to be an OpenCL replacement (I believe Apple ended up fairly disappointed at what became of their tech). As to Xgrid, come on, there are open-source implementations out there that work just as well. What is the point of maintaining a useless software stack if you are not in the supercomputing business? Xgrid was a hobby project by someone at Apple, nothing more.

Dell and Apple are very different companies. Apple is not in the enterprise world when it comes to storage, servers, software (they are now entering that one with iOS with the help of Cisco and IBM) and Linux, but Dell is. All of Dell's innovations lie in those segments, and there are plenty.

The context was comparing innovation in the laptop market.
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
I was under the impression that it was a joint development? I totally agree that the start was slow, but I think it was mostly because Thunderbolt was expensive and not many vendors were interested in adopting it. In the end, it was a fairly limited market. But it kickstarted some very important developments in connectivity.
It wasn't. Intel had something called Light Peak, which they demonstrated before Apple even came into play. Due to their partnership they went to Apple for advice on the connector and ended up using the Mini DisplayPort connector Apple developed.

The start wasn't that slow; there are many Thunderbolt devices, but they sit in the high-end professional segment. You don't really see them in the consumer segment. Consumers go with USB devices instead because USB fits their requirements just fine (Thunderbolt is basically overkill). That doesn't change with Thunderbolt 3, and neither does the pricing, which in fact has gone up (a TB1 or TB2 dock was about 200 USD, but TB3 docks are 300+ USD).

With USB-C for the cabling and USB 3.1 Gen 2 included in Thunderbolt 3, it is now a port for everyone instead of the power user alone. Most likely that is why Apple could afford to go all Thunderbolt 3 on the MBP. This is also the entire idea behind Light Peak/Thunderbolt: Intel wanted to create a technology you could use as a single port.

By asking Intel to develop a custom, more efficient CPU version for the consumer market, Apple started moving the "meta" in a different direction. Also, we are talking about 2008 here. I thought AMD APUs arrived much later?
The iGPU story started with OEMs wanting to sell cheaper machines, because customers with a low budget and companies that had no use for a dGPU were requesting them. You could also build smaller devices with those. It doesn't matter when the AMD APUs arrived; for Intel it was a threat and a reason to keep iGPU development going. At a certain point they even went after the dGPUs from AMD and Nvidia.

And yet Apple created OpenCL. CUDA is a proprietary Nvidia tech, and of course Nvidia is promoting it; it brings them money. Frankly, this is a dick move by Nvidia and nothing worthy of praise.
Many people use CUDA because of its features. If you read the Wikipedia article about OpenCL, there are links to research that found CUDA performance to be better (although in some cases that can be addressed by doing things a bit differently).

Well, they have since developed Metal, which is supposed to be an OpenCL replacement (I believe Apple ended up fairly disappointed at what became of their tech).
No, it's not. Metal is an API that allows applications to directly access the GPU for graphics. It is a replacement for OpenGL at most, but has nothing to do with OpenCL.

As to Xgrid, come on, there are open-source implementations out there that work just as well. What is the point of maintaining a useless software stack if you are not in the supercomputing business? Xgrid was a hobby project by someone at Apple, nothing more.
Apple dropped Xgrid and that was it. There was no promotion of this type of computing whatsoever. Nothing.

The context was comparing innovation in the laptop market.
Exactly. Dell is in a different market, so most of their innovations are in a different market too. Dell had docking stations, the pointing stick (or whatever they call it), a modular bay that housed the optical drive, a notebook that fully supports Linux (so much so that they even sell it with Linux preinstalled) and so on, none of which Apple had. It's not like Dell doesn't innovate at all.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,202
19,062

Thanks for your insights about Thunderbolt and the CPUs, I was not aware of some of the details. Just a few comments about GPU compute below:



Many people use CUDA because of its features. [...]

I have done GPU programming with CUDA, OpenCL and Metal. CUDA is arguably the most feature-complete, but it is still locked to a single vendor. Personally, I believe that proprietary tools like CUDA are not the way to go, for obvious reasons. Given a mature compiler, there should be zero performance difference on the same hardware, no matter which API is used.

No, it's not. Metal is an API that allows applications to directly access the GPU for graphics. It is a replacement for OpenGL at most, but has nothing to do with OpenCL.

Metal is a replacement for both graphics and compute. You don't need to use any of the graphics sub-API (or, in fact, even allocate any of the graphics-related objects) to do compute with it. It still misses some features for scientific applications (like FP precision control), but it can already be used as a replacement for OpenCL to a large extent. And it's extremely easy to use.
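To make the compute-only point concrete, here is a minimal sketch in Swift of what that looks like (just an illustration I'm putting together, not Apple sample code; the `square` kernel and the buffer size are made up for the example): a tiny kernel compiled from source at run time and dispatched with nothing but a device, a command queue, a compute encoder and a shared buffer.

```swift
import Metal

// Compute-only Metal: square the index of every element in a buffer.
// No render pipeline, no drawable, no graphics objects at all.
let source = """
#include <metal_stdlib>
using namespace metal;

kernel void square(device float *data [[buffer(0)]],
                   uint id [[thread_position_in_grid]]) {
    data[id] = float(id) * float(id);
}
"""

// Force-unwraps and try! are for brevity in this sketch only.
let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "square")!)

let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// 1024 threads in total, dispatched as 4 threadgroups of 256.
encoder.dispatchThreadgroups(MTLSize(width: count / 256, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 256, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()

// With a shared buffer the results can be read straight back on the CPU.
let results = buffer.contents().bindMemory(to: Float.self, capacity: count)
print(results[10]) // 100.0
```

No MTKView, no render pass, no drawable anywhere, which is the whole point. A real OpenCL port would obviously need proper error handling and smarter threadgroup sizing, but the structure stays this simple.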
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
Ah yes, you're absolutely right, apparently I forgot about it also being for compute (probably because I only remember the gaming demos).

Anyway, CUDA might be a vendor lock-in technology, but it has become the de facto standard. Metal is also a vendor lock-in, but I'm not seeing it compete with CUDA (Mac only vs. a proprietary technology that can be used on Windows, Mac and Linux). OpenCL has a better chance at that because it is implemented by so many manufacturers. OpenCL is cool, but it simply didn't catch on.

I just wish Apple would spend more time promoting these technologies. Things would have been very different with the Mac Pro if they had simply demoed all the neat things you can do with it (using things like Thunderbolt and OpenCL). They only do a bit of that when they introduce a new feature, and then it stops.
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
Metal is also a vendor lock-in, but I'm not seeing it compete with CUDA (Mac only vs. a proprietary technology that can be used on Windows, Mac and Linux). [...] OpenCL has a better chance at that because it is implemented by so many manufacturers. OpenCL is cool, but it simply didn't catch on.

Well, Metal is not really vendor lock-in, it's platform lock-in (it works on different hardware, unlike CUDA, which only works on Nvidia hardware and is in fact designed specifically for it). But I certainly agree with you that Metal is not meant to compete with CUDA, just to give Apple developers a unified way to harness the power of the GPU across their platforms.

As to OpenCL... I am not sure that it didn't catch on. After all, it is implemented on all major OSes by all major hardware vendors. The problem here is rather political. OpenCL 2.x has a lot of important new features and can compete very well with CUDA, but with Nvidia being very resistant to offering a proper OpenCL implementation (the CL 2.0 implementation only landed in Nvidia beta drivers in February, and of course they don't support OpenCL 2.1+ because that would compete too much with their own CUDA) and Apple freezing their OpenCL implementation, most likely in favour of Metal, the landscape is now uncomfortably fragmented. Of course, we are experiencing a paradigm shift right now, so some volatility is to be expected, but I do wish that a major player like Apple would show more support for open standards such as OpenCL 2.x and Vulkan. They can still keep Metal as a more convenient wrapper, but having a cross-platform low-level GPU API would benefit everyone.
 

dyn

macrumors 68030
Aug 8, 2009
2,708
388
.nl
It is actually both a vendor lock-in and a platform lock-in, because you can only use it on Apple computers (vendor) running macOS (platform). With CUDA you may be tied to Nvidia GPUs, but you can use almost whatever computer and OS you want. That means far more flexibility for the companies and institutions using it (for example, it is not uncommon in science to have some people using Linux while others use Windows or macOS). Either way they are proprietary and you are locked in to something, which is not entirely uncommon; just look at how many people depend on software like MATLAB or MS Office.

Btw, I meant OpenCL catching on with users, the ones who actually write code with it. The fact that so many manufacturers implemented it shows the exact opposite: on that front it is quite a success.
 

sahnjuro

macrumors regular
Jul 15, 2009
101
65
Not all SSDs are created equal. Compare the pricing of a Samsung 850 EVO SATA drive to a Samsung 960 Pro NVMe. Two completely different tiers in performance and pricing. Guess which one Apple uses in the 2016?

You might as well be comparing pricing for a 1TB 5400RPM HDD with a 1TB SATA SSD with that argument.

Apple doesn't use crappy SSDs in their machines.

Yes, they do. I got a SanDisk 256GB in my MBA. It died after a couple of years.

I asked nicely if they could install a Samsung instead of a SanDisk as the replacement, and they obliged.
 