
mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
In a nutshell, Apple is all about minimalism and purity in hardware design. Their products are not about having the most number of features or being the most useful, but about being the purest mix of form and function.
Fair enough - as a long time Apple user, it's a big part of what attracts me to their products.

Apple is all about making great products, but their definition of “great” is always from the eyes of their design department, not the general public.
Also fair enough. Having the courage of their convictions - whether they are right or wrong in specific instances - is part of what makes them a design leader, not a follower.

It’s why the iPhone never had expandable storage or removable batteries, because Apple felt that supporting them would compromise the integrity and beauty of the device.
Yes, though not including expandable storage also has the happy coincidence of pushing customers to buy the biggest capacity device they can afford. It's not like the public are bamboozled by the SIM card slot; micro-SD is pretty much the same thing.

Thin, light and uncompromisingly simple. That’s what makes a great product to Apple, not an endless list of features.
Sure, no one wants an 'endless' list of features. But there's a difference between aesthetic simplicity (great in the showroom) and operational simplicity. Having one port on a laptop looks simple, but carrying a bunch of dongles is more of a faff in terms of actually using the device. As MBP users find when people hand them USB sticks, or they need to use a projector.

I will say that in their own way, Apple likely thought that they were legitimately doing their user base a favour by trying to migrate people over from laptops to iPads. Or that a 2016 MBP with twin LG 5K displays and an eGPU would suffice as a desktop replacement.
An iPad is, at best, a laptop without a keyboard. How is this an improvement (even ignoring iOS's very limited multitasking)? A laptop that puts all the weight in the screen, and is available in a maximum size of 13". The laptop may be an earlier invention, but it actually works pretty well - it's not a poor solution crying out for reinvention. This seems to be more about Apple's search for new markets than actually advancing the customer experience. And how is a laptop + an eGPU + a couple of screens less faff than a desktop PC? It seems like a bizarre workaround to a problem that only exists because for business reasons, Apple isn't interested in selling the latter.

Apple thought (likely still thinks) they knew better than their pro user base, and they thought wrong in this case. The Mac community proved to be a lot more resistant. That, and some people legitimately needed the full power of a Mac Pro for their work. So Apple ultimately felt that the pro user segment, niche of a niche as it was, wasn’t a demographic they were really ready to give up just yet.

It's not always about "people legitimately needing the full power of a Mac Pro". Those specific customers are, by definition, fine with the 2019 MP. What about someone who wants an expandable machine with e.g. an 8-core CPU and an RTX 3070-level GPU? The only reason there's no Mac option is that Apple rightly fears the cannibalisation of more profitable parts of their range. Why is it that, despite everyone normally being happy to copy Apple, there are few examples of AIO Windows PCs? Because when people actually have a choice, as they do in the wider PC market, they buy a laptop or (mini) tower instead.
 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Apple’s never satisfied the majority of customers with the Mac. I don’t think they’ve had too much over 25% marketshare ever. They’ve always been about focusing on a profitable minority in pretty much everything they’ve ever done.
Spot on. Though it's more like 5% marketshare of the PC market.
 

Abazigal

Contributor
Jul 18, 2011
19,579
22,046
Singapore
Having one port on a laptop looks simple, but carrying a bunch of dongles is more of a faff in terms of actually using the device. As MBP users find when people hand them USB sticks, or they need to use a projector.
Do people really need a bunch of dongles and adaptors though?

Here's my experience with the HP EliteX2 which is issued to me at work (as a teacher). It comes with 1 USB A and 1 USB-C port. We are also each allocated one travel hub, which is a USB C adaptor with passthrough charging, 2 USB-A ports, HDMI and VGA. So one adaptor basically has all the ports a teacher needs, though we joke that it's thicker than the laptop itself.

At my desk, I have a 23" monitor, a USB-C charger and a multi-USB hub connected to the travel hub, which means I only need to plug in one USB-C cable to my laptop to power everything. I know it sounds like a very funny first-world problem to gripe about, but it's so convenient to be able to just disconnect a single cable when it's time for me to pack up and leave my desk.

Moving around, I have replaced the travel hub with my own Apple HDMI adaptor (the 3-port version with the single USB-A and charging port), which is so much smaller and more compact. It also works with my iPad Pro, but I have largely migrated to a wireless+USB-C setup. For example, I have this flash drive that has both USB-A and USB-C ports, allowing me to use it with my iPad as well. All classrooms have Miracast dongles that allow teachers to project their laptops wirelessly to the whiteboard, and I have my own Apple TV installed as well to mirror my iPad.

And yes, we do have colleagues who forget to bring it with them to class. It happens. On my end, I like to think that it's short-term pain for long-term gain. Carrying adapters is a current annoyance that the best of us have to endure, but I am hoping that this will in turn spur people on to adopt more USB-C peripherals (though the more time passes, the more I feel this is somewhat of a pipe dream).

Sometimes, the tides of change must be forced.
An iPad is, at best, a laptop without a keyboard. How is this an improvement (even ignoring iOS's very limited multitasking)? A laptop that puts all the weight in the screen, and is available in a maximum size of 13". The laptop may be an earlier invention, but it actually works pretty well - it's not a poor solution crying out for reinvention. This seems to be more about Apple's search for new markets than actually advancing the customer experience.
Think in terms of jobs to be done.

Many people bought a computer in the last 20 years mostly because they wanted internet access. But they didn’t really want a computer, just the internet. In an abstract way, it’s like how, if you wanted a portrait of your family, you once had to either paint it yourself or hire an artist. Then photography came along, became cheap and accessible, and the painted portrait was relegated to the fringes.

It's an improvement for the salaried worker who comes home after a long day at work and just wants to browse Facebook or watch Netflix on the sofa. The simplicity of iOS would actually be an advantage over the complexity of a desktop OS when all you are doing is launching apps to access content.
And how is a laptop + an eGPU + a couple of screens less faff than a desktop PC? It seems like a bizarre workaround to a problem that only exists because for business reasons, Apple isn't interested in selling the latter.
I imagine that Apple had envisioned the future of computing to be impressive power in a thin and light package that can tether to an ultra-powerful rig when needed. They were trying to sell users an idea. Not just a product.

Imagine this scenario - when you are out and about, you have a very capable, yet thin and light laptop that can give you sustained performance even when away from a power outlet. This is because Apple values portability as much as raw power. This is in contrast with Windows laptops that sport more powerful hardware specs on paper, but then throttle hard when not plugged into a power source.

Then when you are at a desk, you can hook your laptop up to a Thunderbolt Display, an eGPU, multiple hard drives, even fit in an ethernet cable, all with one dock.

By plugging in a single cable, you get power, display and data. It's kind of like having the best of both worlds, but boy will you have to invest a small fortune in this specialised setup that is near impossible to upgrade readily.

Then when it's time to go, just disconnect the cable and leave.

I suspect it's partly why Apple decided to kill MagSafe as well. A cable that comes apart when tugged is actually a liability when it's responsible for transferring data and powering an external display. The versatility of USB-C made it the ideal port for Apple to go all in on, because how else are you supposed to spur adoption of USB-C accessories and workflows other than by burning bridges and signalling to the industry that you are going all-in on this?
It's not always about "people legitimately needing the full power of a Mac Pro". Those specific customers are, by definition, fine with the 2019 MP. What about someone who wants an expandable machine with e.g. an 8-core CPU and an RTX 3070-level GPU? The only reason there's no Mac option is that Apple rightly fears the cannibalisation of more profitable parts of their range. Why is it that, despite everyone normally being happy to copy Apple, there are few examples of AIO Windows PCs? Because when people actually have a choice, as they do in the wider PC market, they buy a laptop or (mini) tower instead.
Like you said, it's a "want", not a "need".

More people than not "want" an expandable machine because they want to be able to just buy the base model and add additional RAM and storage themselves to save some money. In terms of performance, the iMac (specced according to use case) actually more than suffices for many people. Then you get the iMac Pro for those who need even more computing power. You just have to be prepared to pay for it.

I also suspect the reason why we haven't seen more Windows AIO models on the market is price. More specifically, the centrepiece of any AIO is the display, which is probably also the most expensive part of the computer and will jack up the price significantly. Apple users are willing to shell out thousands of dollars for an iMac because Apple has, for better and for worse, aggregated the best spenders. To put it another way, Apple customers tend to have more disposable income and don't mind splashing out more cash for "niceness".

That and if you wanted a decent Mac desktop, the iMac is your only real option.

Windows users, not so much. I have seen a number of HP-branded AIOs and they just don't seem as well built.

That's just my thoughts, at least.
 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Point taken about the convenience of using one cable to attach a laptop to a monitor for power, video, USB and all the rest. USB-C scores highly here and is definitely a step forward. The question is really whether to include USB-A, HDMI and so on as well. As ever with these things, including the old version lessens the incentive / need to move to the new version. The early adopters bear the brunt of going all-in with a new port, though I accept that a compact 3-way dongle likely covers most typical uses.

Using a double-ended USB stick makes sense, and I'd likely do the same, but USB-A sticks continue to be prevalent, not least due to their compatibility with everything made in the last couple of decades. Any stick anyone else passes to you is likely to have a USB-A connector.

I completely agree about iPads for casual computing - my 2015 iPad Pro is a constant feature on the sofa, and I use it daily. I just wouldn't be tempted to work on it. I have used it as a second screen for a laptop, but tbh it makes more sense to just use my desk setup - far better ergonomically.

The problem with eGPU docks is that they cost about as much as a small PC, before you've added the GPU. May as well just get a separate desktop. Also, the laws of physics mean that for any given process (7nm, 5nm etc.), a computer that's connected to the mains and has no significant size restriction will always be able to outgun a thin laptop that's necessarily geared for efficiency. At the very least, it will be a much cheaper way of doing it - e.g. a mid-level desktop chip vs an aggressively binned top-end mobile chip. Sure, maintaining one computer rather than two has some advantages, but documents are generally synced via the Internet anyway these days, and redundancy isn't necessarily a bad thing.

From the customer's point of view, upgradeability clearly makes sense. Not because most people are going to stuff their machine full of upgrades, but because it means the door is not closed to upgrading after the point of purchase. You could e.g. order a Mac with a 500GB SSD, then stick a 2TB one in later, as you need it / they come down in price. And not everyone needs to be a 'PC expert' - most families have or know someone who is confident enough with computers to open a side door and slot in a card or drive. Of course, a company like Apple that is renowned for its eyewatering upgrade prices of RAM, SSDs and GPUs would really rather you didn't have the option of circumventing them.
 

theluggage

macrumors 604
Jul 29, 2011
7,507
7,401
The number of loyal Apple II customers shrank

Well, yes, but that wasn't exactly Plan A - and that era saw Apple go from possibly the Top Dog of the early personal computer market to a cult niche. Remember that the Apple III was a massive failure, leaving the Apple II series to be extended long past its sell-by date. Then the Lisa was a commercial failure. By the time the Mac came out, Apple were already having their lunch eaten by the IBM PC (unless I've grossly misunderstood the 1984 Superbowl ad and the Big Brother character on the screen was meant to be Steve Wozniak...?)

Apple probably couldn't have done anything to avoid the IBM PC crushing everything in its path, since that had nothing to do with technical merit and everything to do with "nobody ever got fired for buying IBM", but not having a proper successor for the Apple II certainly didn't help.

Also, different times technically: the Apple II was an 8-bit system running mostly lovingly-hand-crafted hardware-specific code. That was never going to "scale" to more powerful 16-bit systems with more sophisticated operating systems and layers of hardware abstraction, true bitmapped colour graphics etc. and was always going to have to be torn up and thrown away sometime in the late 1980s. Things are a lot more mature and stable now - if Apple Silicon is the "next big thing" then look at how so much x86 Mac software just needed re-compiling and (relatively) minor tweaks to become fully fledged Apple Silicon apps (compared with "back to the drawing board" for Apple II -> Mac).

They could drop the Mac Pro line tomorrow and there would be no significant impact seen in the next quarterly results, so no.

Yes, let's all phone our employers and ask for a 15% pay cut. I mean, everybody here can apparently afford premium electronics products, so we could all probably take a 15% hit with no serious hardship. What do you mean, "Why would I do that?"

...plus, that would leave Apple even more dependent on iPhone income, which is highly seasonal and highly fashion-dependent. One day, something like "Yes, my Grandad has an iPhone" could go viral and iPhone sales are going to catch a huge cold. Diversification is good.

Then there's the slight matter of all those iOS developers who need Macs as development systems... although I suppose Xcode for Windows would be feasible. Oh, and they'd lose some of their sales of iPhones, services, etc. because although owning a Mac isn't the only incentive to go with iPhone, Apple Music etc. for the "seamless integration", it's a biggie, and Mac owners tend to have deeper than average pockets...

Going back to a prior post, the Mac Pro as a system was REALLY not needed. They were satisfying the vast majority of their professional customers with something that was already available.
The Mac "ecosystem" needs a Mac Pro system, unless the Mac is going to evolve into a toy for basic office work and updating your Facebook.

First, developers & "techies". They're actually one of the worst served by the 2019 Mac Pro because (in general) they don't need 28-core Xeons or quad GPUs, but they do like things like flexible, multi-display setups, lots of affordable RAM for virtualisation, multiple drives for booting different OS versions etc. They also need flexibility since hardware requirements may vary enormously from one job to the next (unlike someone, say, running FCPX or Logic every day).

If you alienate the developers and techies, sooner or later, the supply of software and tech support is going to dry up. Not to mention - these are potentially the #1 advocates for evangelising MacOS to potential customers. They're the people who kept Mac users going in Wintel-dominated workplaces during the darkest days of the 90s.

Then, there's all the creative pros that are currently happy with their MBPs and iMacs. Well, yes, but that depends on there being "pro" software and hardware. It can also depend on MacOS systems further up the chain to do the heavy lifting. If a video production company has to go to PC/Windows to get a stable, high-spec Xeon system to reliably handle their biggest rendering jobs, or take specialist PCIe hardware, then it usually makes sense to have everybody in the chain running the same/compatible OS and apps, rather than having to support two platforms. There will be endless pressure on Mac users to "conform".

...next thing you know, some of the third-party "creative pro" apps will be falling further and further behind the PC alternatives, because the most demanding and influential users are using PC. Sooner or later Adobe: "Well, Photoshop Elements satisfies the needs of 80% of our Mac users, so why bother to support full CS? We don't give a wet slap if the other 20% buy PCs, because they'll probably still use our software..."
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Do people really need a bunch of dongles and adaptors though?

Here's my experience with the HP EliteX2 which is issued to me at work (as a teacher). It comes with 1 USB A and 1 USB-C port. We are also each allocated one travel hub, which is a USB C adaptor with passthrough charging, 2 USB-A ports, HDMI and VGA. So one adaptor basically has all the ports a teacher needs, though we joke that it's thicker than the laptop itself.

At my desk, I have a 23" monitor, a USB-C charger and a multi-USB hub connected to the travel hub, which means I only need to plug in one USB-C cable to my laptop to power everything. I know it sounds like a very funny first-world problem to gripe about, but it's so convenient to be able to just disconnect a single cable when it's time for me to pack up and leave my desk.

Moving around, I have replaced the travel hub with my own Apple HDMI adaptor (the 3-port version with the single USB-A and charging port), which is so much smaller and more compact.

I think that's just about all you need:

  • a "docking station"-style hub that's a bit bigger but contains the stationary stuff on your desk with your laptop. (I'd include Ethernet in that as well.)
  • a portable dongle-style adapter that has just the essentials (which, for many, indeed include USB-A and HDMI)
I don't have this kind of laptop yet, but I imagine there'd be two problems in practice:

  • it's slightly disconcerting that none of that is included. You're already buying a relatively expensive laptop, and you basically have to buy one or two such adapters on top of it. Which is great in that it makes you more flexible in your buying choices, but not so great in that MBPs are pricey to begin with.
  • this approach is prone to you forgetting the adapter while on the go. On my 2013 rMBP, I've experienced the "aw crap, I forgot the Ethernet dongle" moment a few times. Likewise, with this, I'd run into "I only have the power brick, but neither Ethernet nor HDMI". Maybe not often, but when it does happen, it's bound to happen in the most embarrassing of moments.
On the bright side, unlike the Thunderbolt 2 with my rMBP, USB-C is (increasingly!) common, so odds aren't that low that wherever I show up (client, hotel room, etc.), they have a matching adapter themselves. Or perhaps a nearby gas station does.

Still, this entire problem doesn't exist if the laptop has such ports built in. I think the "Retina" era of MBPs struck a better balance there than the "Touch Bar" one.

Connecting just one cable to your desk and having everything just work is great. (Right now, I connect MagSafe, a Belkin Thunderbolt dock, and an HDMI cable, because I have two external displays. With a tbMBP, I'd probably just have to connect a USB-C dock and an HDMI cable. Not bad!)

 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Apple probably couldn't have done anything to avoid the IBM PC crushing everything in its path, since that had nothing to do with technical merit and everything to do with "nobody ever got fired for buying IBM"
It was also because IBM needed an OS for its new 'Personal Computer', and hired Microsoft to write PC-DOS. Then a canny Bill Gates rewrote it enough to avoid lawsuits, whilst still being fully compatible, and sold it as MS-DOS to anyone who wanted it. With this combination of IBM software compatibility and commodity hardware, it quickly became the industry standard.

At the start of any tech revolution, people don't necessarily care which standard becomes the standard (e.g. HD-DVD or Blu-Ray), they just don't want to back the wrong horse. Once a standard emerges as the leader, everyone then piles in.
 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Still, this entire problem doesn't exist if the laptop has such ports built in. I think the "Retina" era of MBPs struck a better balance there than the "Touch Bar" one.
Yes. By all means add USB-C / Thunderbolt 3, but the existing ports were widely used and useful (including Magsafe).
 

theluggage

macrumors 604
Jul 29, 2011
7,507
7,401
It was also because IBM needed an OS for its new 'Personal Computer', and hired Microsoft to write PC-DOS.

Let's fix that for you:
and hired Microsoft to ~~write PC-DOS~~ buy and re-badge 86-DOS/QDOS/whatever from a third party

Except the other contender for the IBM PC's OS was CP/M-86 (of which PC/MS-DOS was basically a clone - and that's putting it tactfully), which itself was an update of CP/M-80, which would also have been freely available to other PC manufacturers. CP/M-80, CP/M-86 and PC/MS-DOS weren't "binary compatible", but they were a gnat's whisker away from being source-compatible: porting code was fairly straightforward. CP/M-80 had been around since the first real personal computers, was sort-of multi-platform and probably the main competitor to the Apple II in its heyday. Heck, you could run CP/M on an Apple II at a pinch (although that involved plugging in a Z80 processor card).

There were third party MS-DOS machines around for a long time, which didn't set the world on fire. Trouble is, whereas previous CP/M software had to cater for a range of slightly different platforms (often coming with a patch program that let you specify video RAM addresses or terminal escape-codes if you had a more obscure system), IBM got so dominant in the serious-computing-for-people-in-suits market that software started to be hard-coded for the PC and its proprietary BIOS firmware, so the generic MS-DOS machines wouldn't run IBM PC software.

The turn-around came when some bright spark (not Microsoft) devised a clean-room software cloning technique that let them produce a 100% IBM compatible clone without being sued by IBM (today, IBM would just have stomped them into the dirt with software patents...) - then the PC clones took off. Having "generic" MS-DOS helped, but only after IBM achieved enough dominance for it to be worth the cost and risk of cloning the BIOS.

Tragedy is, hardware was just getting good enough to start running proper operating systems like Unix, with more hardware abstraction and true code portability, then IBM comes along and fossilises everything in the CP/M age for the next couple of decades...
 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Interesting. Never realised it was reverse-engineering the IBM PC's BIOS that was the key to the third party PC hardware market. So it wasn't all down to Bill Gates' nous then. Without BIOS emulation, MS-DOS could never have taken off the way it did. Still, MS-DOS was well positioned to take advantage when that happened.
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Interesting. Never realised it was reverse-engineering the IBM PC's BIOS that was the key to the third party PC hardware market. So it wasn't all down to Bill Gates' nous then. Without BIOS emulation, MS-DOS could never have taken off the way it did. Still, MS-DOS was well positioned to take advantage when that happened.
It's kind of a story of everyone screwing each other over.

IBM didn't want to depend on CP/M any more, so they contracted Microsoft.

Microsoft didn't actually have a CP/M alternative, so they contracted Seattle Computer Products to make a vaguely compatible OS.

Compaq didn't want to depend on IBM, so they reverse-engineered IBM's BIOS to have a vaguely compatible BIOS.

Which means IBM ultimately got a taste of its own medicine.
 

mode11

macrumors 65816
Jul 14, 2015
1,303
969
London
Seems like a gold rush type situation. Once CPUs like the 6502, 8086 and Z80 were around, the desktop market was poised to explode. Sooner or later one standard would emerge as the leader, and be a license to print money.
 

mac.fly

macrumors regular
Apr 8, 2008
110
1
UK
Can someone confirm that the latest iMac has the same quiet fans as the iMac pro had?
(I have a late 2015 iMac and its fans are way too noisy for me.)
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Can someone confirm that the latest iMac has the same quiet fans as the iMac pro had?
(I have a late 2015 iMac and its fans are way too noisy for me.)
It does not. The iMac Pro has different internal cooling that’s non-trivial to do in other iMacs as long as they still offer hard disk options.
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Hopefully, the new chips produce less heat, so the fans won’t have to spin up as much.
Even if they use the Intel design again (say, if they want to launch an iMac any moment now), then I imagine it’ll still run a fair amount cooler, while being much faster.

Something like an 8+4 or 12+4 M1X. Probably won’t draw more than 50W. Right now, they have some 125W chips in there (possibly configured down to 95W, not sure).
 

hot-gril

macrumors 68000
Jul 11, 2020
1,924
1,966
Northern California, USA
Yeah. The iMac Pro still had all the same problems that Intel processors have.
You can throw 28 cores at the thing with 256 GB of RAM, but it’s not gonna solve the problem that the processors suck.
When a $1000 MacBook Air with only four performance cores can feel more smooth and optimized than a $5000 iMac Pro, it’s time to discontinue that sucker.
This ignores what the iMac Pro is actually made for, multi-core CPU performance. The real problem with it is the form factor. The thermals were really tight, and you couldn't upgrade anything later. Same guts in a tower would've been a lot better, and that's what the 2019 Mac Pro was, but the iMac Pro was suitable for some professionals anyway.

This also doesn't touch on the GPU. Dedicated pro-tier GPUs are absolutely necessary for some workflows.
 

Yebubbleman

macrumors 603
May 20, 2010
5,789
2,379
Los Angeles, CA
The iMac Pro was meant to be a stopgap product to tide people over until the 2019 Mac Pro came out. The fact that it lasted this long thereafter is kind of ridiculous. Apple Silicon versions aside, I can't imagine that there are terribly many folk for whom either a Mac Pro and/or a 2020 27" iMac won't suffice (and be better).
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
It's basically the one big remaining plus of an iMac Pro over a regular iMac — if you have sustained high-performance workloads, the iMac gets rather hot.

Other than that, a ten-core iMac will be 12% faster at single-threaded tasks (and 4% slower at multi-threaded) than a ten-core iMac Pro. It's only if you do a lot of highly-parallelized workloads that the Pro shines (the 18-core iMac Pro will be 48% faster — but, again, only if you actually use up all those cores, which will be rare). But the Pro will be much cooler and quieter.
 