Thanks for wrapping up the entire boring PC history. The facts remain that Apple leads the PC industry in profit share, holds a 90% share of the ARM-based PC segment, is the only brand that still grows its share every quarter, and Big Blue has left the PC market entirely. Being an innovator instead of a monopolist paid off in the long run. And again, nobody cares if penny-pinching managers are happy with their cheap Chinese junk. Users prefer Macs and they pay up for the best computers.
And Apple was less than 6 months away from bankruptcy in 1997. Don't forget that.

Also, you don't understand the business world. Most business Lenovos are not cheaper than Macs - in fact, many people have been shocked when you tell them how expensive their ThinkPads are; they're like "but you could get a Mac for that". A T14s ThinkPad with an i5, 8 gigs of RAM, 256 gig SSD is ~$2100CAD nowadays, add $400 for a warranty upgrade, $300 for the docking station. If I take my $2100CAD to Apple, let's see... I can get a 13" M2 MBP with 8 gigs of RAM and 512 gigs of storage.

You buy a business Lenovo for a couple reasons:
i) Good, on-site support. If something breaks on a business Lenovo/HP/Dell, they'll send a technician with a replacement part to your office the next day. And if you have 50-100 laptops or more in a business, something will always be breaking. And traditionally, the machines are more modular - if the SSD is separate, for example, then if the motherboard dies, the technician replaces the motherboard, you keep the SSD, re-enter the BitLocker key, and you're done, whereas if the SSD is built onto the motherboard, now you need to reimage or otherwise re-set up the software on the machine.
Also, for ~20 years they've offered accidental damage warranties, so you avoid the HR politics when someone drops their laptop and shatters their screen. (Apple now has accidental damage warranties, but that's relatively new.)
Apple now seems to have some forms of AppleCare for Enterprise, but there's a minimum 250 units required. And when you call normal Apple support, they treat you like a moron, just like every other consumer technology company, whereas if you call Lenovo business support, and you sound like an IT professional and you say "so this laptop is doing X and Y. I did troubleshooting steps A, B, and C, and I think it's likely that part E is bad", they will arrange for a replacement part E to be sent to you. Apple will tell you to wipe the device. (In peak pandemic times, I tried to get a moody iPad replaced by going through phone support. It was an excruciating experience. The only way to get good support from Apple is in-person at the genius bar.)

ii) The Windows platform has a lot more corporate-friendly management tools than the Mac traditionally had. For most businesses, that's important. And those management tools haven't needed much rethinking since Windows 2000 and the deployment of Active Directory, whereas if you want to start managing Macs, you need to rethink everything.

And these tools all tie into the Microsoft file sharing servers, the Microsoft email platform (Exchange/O365), etc.

That being said, as the world moves more towards mobile/cloud/etc, and you're replacing old-fashioned management tools with MDM platforms like Intune, integrating Macs becomes a lot easier.

iii) Lots of businessy software that's Windows-only. That's changing, though - a lot of the newer things are for Chrome instead of for Windows now.

iv) Nice docking solutions... although those are dead now. Apple invented docking with the Duo, and then abandoned it. The businessy PC vendors had nice, easy-to-manage, simple, foolproof docks that corporate IT types happily paid big money for until they lost their minds a few years ago and went all USB-C.

I like Apple as much as the next guy. I have 3 Macs at home. My parents are now both on Mac. But I've also ordered dozens of ThinkPads for work and I can assure you I was not being a "penny-pinching manager" buying "cheap Chinese junk." In fact, I've had people's jaws drop when I tell them how much is spent on a ThinkPad or, for that matter, a good reliable business-grade monitor with a good warranty. I've also had people be surprised that their ThinkPad was the first non-junk Windows machine they'd ever encountered.
 
I mean, that is part of the issue, right? If I go into one of the few universities that still have a computer store attached to the bookstore, you have the plastic Dells and bottom-of-the-line Lenovo machines, and next to them a full-metal MacBook Air. If I'm the kid's parent, the shiny metal one seems sturdier and just looks nicer. So yeah, I'll probably get them an Air or a 13-inch Pro. The richer kids get the 16-inch Pros because only the best will do, even if it's never used. It's not that you can't get a nice-looking PC, it's just that the recommended models tend to be pretty low-rent. I miss the days of Apple really excelling at the OS and amazing apps, but those days are long gone. I think OS X is marginally better from a user standpoint than Windows, but it's not night and day like it used to be. But damn if my students all don't have to have their $1,500 phones the week they come out.
So, I've been out of post-secondary education for... way too long. Close to 15 years.

In 2001, I know the campus computer store option was a Toshiba on the lower end, and then for people with "more demanding" needs, it was some kind of a ThinkPad for $2500CAD. There might have been a second Toshiba configuration in the middle, too. And... I don't remember any Mac options, though I presume that computer store would have been an approved Apple education retailer (aren't they all?)

If the campus computer stores are now selling junky Worst Buy-style Windows machines rather than ThinkPads and Latitudes and whatnot, then... good heavens.

(And I guess the other question is - are those computer stores even still a thing? Back in 2001, I think most people had a single "family computer" at home, so when they went off to post-secondary education, that included a stop at the campus computer store to buy the recommended computer. But I presume that's not how things have been for a long time now... )

And honestly, I will recommend Macs to many non-techie people simply because it's difficult to go wrong buying a Mac. Other than maybe the low-end Macs being a little lower than I'd like on RAM (and that's less of a worry with Apple silicon), you can walk into the Apple store, walk out with anything, and get something functional. And something that's not going to have absurd limitations like a 1x1 wifi antenna, a 100 megabit Ethernet port, etc. If you send someone to Worst Buy to buy a low-end Windows laptop, they can find a horribly unusably slow piece of junk next to something perfectly functional. Both with similar external enclosures. And the difference in price, depending on sales, etc, could be as little as $50-100.

And the flip side of that is service. If you have a problem with a Mac and you're clueless, you can go to the Apple store and get good reliable service. If you have a problem with a Windows machine, then... I wouldn't trust the geek squads or similar to provide half-passable, honest service. You don't have the businessy support you get from businessy Windows machines. So the Windows machine requires much more of a knowledgeable friend/family member/etc to maintain...
 
Being an innovator instead of a monopolist paid off in the long run.
Also, I forgot to respond to this.

Apple was the company that behaved as a monopolist in the late 1980s. If you look at what happened during peak Gasseeism, they introduced new machines at ever-higher prices because they thought that they had a monopoly in certain markets (e.g. graphic design) and those markets would pay the $11K for a IIfx. Apple basically tried to ignore Moore's law - if you were looking for a Mac in summer 1990, you were still being offered a 68000 1 meg SE/Plus at the same price as (or higher than) in 1987, and if you wanted something nicer, well, hope you enjoy opening up your wallet for a IIci ($8800USD launch price).

And even when they started repudiating Gasseeism with the launch of the Classic/LC/IIsi in 1990, they were still behaving like a monopolist trying to slice and dice Macs to price discriminate as much as possible. A good number of the models of the early 1990s were hobbled simply because Apple wanted to make sure that professional buyers would still buy the $2000 more expensive model. So no FPU here, a few MHz slower clock speed there, etc. Gasseeism was predicated on the idea that i) the Mac had an inalienable advantage for certain industries, ii) people in those industries would continue to pay monopoly prices for IIfxes and Quadras and Power Mac 8100s, and iii) what happened in the market for lower-priced computers didn't really matter to Apple's long-term success (which turned out to perhaps be the greatest mistake of all - in the long term, network effects and economies of scale matter a lot more than Mr. Gassee, or his counterparts at Sun, SGI, and other deceased workstation vendors, might have thought). Meanwhile, the professional apps were getting ported to Windows. Photoshop reached Windows in 1993, Illustrator had existed on Windows since 1989, QuarkXPress reached Windows in 1992, PageMaker had technically existed for Windows since 1987, etc. And, not coincidentally, as that software made its way to Windows, the price of high-end desktop Macs aimed at those industries started and kept plummeting, at least until the 2013/2019 Mac Pros.

Behaving as a monopolist (at least after losing their lawsuit against Microsoft) is what caused Apple to lose the GUI PC war and nearly go bust in 1997. And while you are starting to see a few teeny spots of Gasseeism in the iPhone lineup (e.g. the SE, the use of last year's chip in the 14, etc), one of the key factors in their success in the 2010s in particular is that they didn't act like a monopolist in that way.
 
Most business Lenovos are not cheaper than Macs - in fact, many people have been shocked when you tell them how expensive their ThinkPads are; they're like "but you could get a Mac for that".
And I'm flabbergasted that people buy them at all.
A T14s ThinkPad with an i5, 8 gigs of RAM, 256 gig SSD is ~$2100CAD nowadays, add $400 for a warranty upgrade, $300 for the docking station. If I take my $2100CAD to Apple, let's see... I can get a 13" M2 MBP with 8 gigs of RAM and 512 gigs of storage.
I'd buy neither of them. $1500 should be the absolute max for a normal PC. Nobody should buy an i5 laptop in 2023 anyway. x86 is obsolete technology. Who wants to burn their lap and have no battery life?
I like Apple as much as the next guy. I have 3 Macs at home. My parents are now both on Mac.
That's not a lot of Macs. I only have 4 between me and my father and could use 1 or 2 more.
In fact, I've had people's jaws drop when I tell them how much is spent on a ThinkPad.
Because they can't believe someone would waste their own money on cheap junk. And by cheap I obviously mean build quality, not what you've overpaid for.
I've also had people be surprised that their ThinkPad was the first non-junk Windows machine they'd ever encountered.
If it has Windows on it, it is junk by definition. You should ask yourself, why do people act shocked and surprised when you tell them about your purchase decisions? Do they think your reasoning is a bit crazy?
 
Apple was the company that behaved as a monopolist in the late 1980s. If you look at what happened during peak Gasseeism, they introduced new machines at ever-higher prices because they thought that they had a monopoly in certain markets (e.g. graphic design) and those markets would pay the $11K for a IIfx.
You confuse luxury prices with monopolistic behavior. Microsoft is a monopolist because they actively seek to undermine the business of potential competitors. Apple didn't stop anyone from building a better, cheaper computer for graphic design. Instead they priced it higher according to its value to their business customers and used the profits to develop even better hardware and software, which only extended their competitive advantage in the field of graphics and design.

 
MacBook − Windows − ThinkPad
iPhone − Android − Galaxy
Mercedes − Toyota − Lexus

In many markets you have the original inventor of the product, which over time becomes synonymous with build quality and continuous innovation and commands the highest luxury price points. And then there is the less desirable copycat brand, which often holds the biggest market share simply because it is a lot cheaper.

Out of envy, the maker of the cheap knockoff inevitably gets the idea to create his own luxury version of the cheap knockoff and ask for the same kind of money as the luxury brand. Why anyone would fall for this strategy is beyond me. A product with the price of a luxury brand and the appeal of a cheap copy should never find a customer. Of course you're supposed to buy a MacBook Pro if you're paying anywhere near as much!
 
And those monitors will always, always be "normal"-resolution, not something high-resolution like Apple's retina displays.

With all the legacy software in Windowsland, the recent popularity of VDI solutions like Citrix (which may be running on servers with older Windows OSes), etc, reasonable business IT people simply do not want the potential headache of monitors that require scaling. Not to mention the potential additional complexity of having different monitors requiring different scaling settings.

And that is why there are no retina-grade monitor options on the market other than a few "4K" monitors that don't really fit (a 4K 28" monitor at 1920x1080 doubled is going to have some seriously huge text).

One big thing, too, the last few years has been the ultra-wide 3440x1440 monitors. Note how they're a "dated" resolution but one that doesn't require any scaling... which I am sure is a big part of their popularity.
Do scaled resolutions on 4k 27" monitors really create such an issue with Windows that it's a problem for IT? No idea myself, but given how (relatively) commonplace those have become in the Windows world (they're essentially commodities these days, which is why they can be purchased relatively economically), I'd be surprised.

I've not found that text size is an issue on 4k 27" (the standard size for 4k) at 2:1 scaling on Macs, since I simply adjust the Zoom in my various apps to get the text size I want, as I do on all monitors. It's only the UIs you can't adjust, and those take up such a small percentage of my screen real estate that it has little significance for me. And to the extent it does have an effect, I prefer it, since it means wider scroll bars and larger close and full screen dots, which makes them easier to target quickly when I'm working rapidly.

[I run all three types of displays we're discussing side-by-side on my Mac, so I'm very familiar with all of them. My main monitor is a 5k 27" Retina, my RHS monitor is 4k 27", and my LHS monitor is an extended-HD (1920 x 1200) 24" (see details in my sig line).]
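Just to put rough numbers on the scaling discussion, here's a back-of-the-envelope sketch (Python; the panel specs are just the nominal ones for the sizes mentioned above, so treat the output as illustrative arithmetic rather than vendor data):

```python
import math

# Panels discussed above: (label, native horizontal px, native vertical px, diagonal inches)
panels = [
    ("5K 27\"",    5120, 2880, 27),   # Retina-class 27" panel
    ("4K 27\"",    3840, 2160, 27),
    ("WUXGA 24\"", 1920, 1200, 24),   # low enough density that it's simply run at 1:1
]

for label, w, h, diag in panels:
    ppi = math.hypot(w, h) / diag     # pixel density along the diagonal
    print(f"{label}: {ppi:.0f} PPI, strict 2:1 workspace = {w // 2} x {h // 2} points")
```

Run as-is, that works out to roughly 217 PPI and a 2560x1440-point workspace for the 5K, 163 PPI and a 1920x1080-point workspace for the 4K (which is exactly the "huge text at 2:1" complaint upthread), and about 94 PPI for the 24", which is why nobody scales that one at all.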
 
And I'm flabbergasted that people buy them at all.

I'd buy neither of them. $1500 should be the absolute max for a normal PC. Nobody should buy an i5 laptop in 2023 anyway. x86 is obsolete technology. Who wants to burn their lap and have no battery life?

That's not a lot of Macs. I only have 4 between me and my father and could use 1 or 2 more.

Because they can't believe someone would waste their own money on cheap junk. And by cheap I obviously mean build quality, not what you've overpaid for.

If it has Windows on it, it is junk by definition. You should ask yourself, why do people act shocked and surprised when you tell them about your purchase decisions? Do they think your reasoning is a bit crazy?
Have you ever worked in a business? And in a business that wasn't started up in the last 3 years, i.e. that has some amount of legacy software, data, etc in systems that management spent big money for, that staff is trained for, etc?

The idea that "x86 is obsolete technology" is laughable. The overwhelming majority of software that is used to make money in most industries runs only on x86/amd64. And most of the business software that doesn't run on x86 runs on, oh, I dunno, IBM zArchitecture, IBM i-formerly-AS/400, etc. Not sure if there's too much VMS still out there.

I shouldn't need to tell you this, but the purpose of IT in every organization is to provide computer systems that help that business make money at whatever that business does. If you work for Boeing in IT, your goal is to provide systems that can be used to design, support, etc commercial and military airplanes. If you work for Coca Cola, your goal is to provide systems that can be used to manufacture soft drink syrups, oversee bottling plants, distribute finished beverages. Etc. The software that helps these businesses make money is overwhelmingly written for x86 (either Windows or, for client-server systems, some form of Linux/Unix on the backend) or IBM architectures.

I don't want to make ageist comments, but your perspective sounds like someone who has no appreciation of 'legacy' systems and their prevalence in the world around you. Go to a store and buy something using a credit card - your payment will be processed using mainframes running IBM zArchitecture. If you think x86 is obsolete technology, what does that make zArchitecture? With its perfect 100% compatibility with software written all the way back in the mid-1960s? And flip the light switch in your room - do you think the power plant that supplies your electricity has control systems running on ARM, or anything else developed in the last two decades?!

There is a reason that x86 is popular in the business world and elsewhere. There is a reason that IT departments 'happily' spend money on expensive Windows laptops or desktops. It is not stupidity. Frankly, when you think people with decades of experience are doing things out of "stupidity", chances are, if you want to see stupidity, it is between your chair and your keyboard. Just because you have no real world experience doesn't mean that the people who see value in what you view as obsolete are wrong.

Frankly, you might as well go and call up, say, a construction company and tell them that they're stupid to buy pickup trucks when a Toyota Prius gets way better fuel economy and costs one third as much. They will laugh at you and tell you that it's impossible to carry the materials they need for their work in the back of a Prius. Just because you think a Prius is a higher-tech, more environmentally friendly vehicle does not mean that there isn't a place for one-ton pickup trucks in the construction industry. And if you are too closed-minded to see it, that's on you, not on them.

Also - I'm presuming that I'm a bit older than you, because I remember when everybody was excited about RISC this, PowerPC that, etc. Go and find a Mac magazine from Oct. or Nov. 1991 when the AIM alliance was announced, all the excitement about how amazing PowerPC was going to be, how it was the future, etc. Then realize that, 15 years later, Apple started selling systems that are functionally/architecturally IBM compatibles (I suspect, though I have never tried, at least the first generation of Intel Macs with "Boot Camp" are capable of booting MS-DOS; they can certainly BIOS boot Windows XP). And then you start to look at the world a bit differently - why did this exciting platform turn into a complete flop 15 years later, while Windows (which was a joke in 1991, two years before the first version of NT would ship) running on an architecture that everybody considered outdated went on to run the world?

And in 1991, if you were a kid like me, you encountered vague references of big systems - UNIX, VAX, AS/400, RS/6000, Alpha, Silicon Graphics, etc - in magazines. Big systems that people used for Serious Serious Work and that were talked about in Serious Places, not magazines you could buy for $4. And guess what - x86, and to a lesser extent Windows NT, basically ate all those big systems. The architecture that everyone considered a joke in 1991 (with its 640K memory contortions, etc) and two operating systems that didn't exist in 1991 (Windows NT and Linux) somehow ate all the unmentionably-big systems.

If someone, in 1992, had said that they wanted to port some Big Serious Software that ran on *NIX workstations or IBM systems to Windows on x86, any 8-12 year old kid who read magazines would have been like "you're an idiot. x86 is dead. RISC FTW! And Taligent!" (Remember that Windows NT ran on a bunch of exciting RISC architectures, too... all of which promptly flopped. And as for Taligent, I don't think their operating system went much beyond the excited magazine articles.) And yet... 10 years later, essentially all of that Big Serious Software was running almost exclusively on NT or Linux on x86 machines.

Excitement (and excitement over perceived technical merit) does not guarantee long-term success; in fact, if you took a magazine like, say, Byte (first publication, 1975) and went back over everything they wrote about, looked at how those technologies ended up doing, etc, I would guess that most of the things they were excited about ended up flopping. And meanwhile, x86, Windows, etc. just kept going and going. If you had told people in 1981, reviewing the original IBM PC with its 8088, that descendants of that architecture would not only dominate the nascent microcomputer market, but would swallow the minicomputer market and make a serious dent into mainframes, they would have laughed at you.

In its feature on the 10 best car engines of the 20th century (https://www.wardsauto.com/news-analysis/10-best-engines-20th-century), Ward's Auto described the GM/Buick 3.8L engine as "the poster child of a bad idea turned good through fastidious refinement." That, in my mind, would describe the x86/IBM PC architecture as well - it was a bad idea originally, it remained a bad idea, it's still considered a bad idea 40 years later, yet somehow through fastidious refinement (largely by Intel and Microsoft, with some contributions from Compaq, AMD, Linus Torvalds and others along the way), it has basically dominated everything else and delivered absolutely unbeatable performance-per-dollar. In fact, in 40 years of it being considered a bad idea, it is only in the last, oh, 3 years, and only after Intel lost their manufacturing edge to TSMC, that other alternatives have shown ANY prospect of supplanting it. And so far, primarily in laptops. Not sure where ARM-for-servers efforts are at; x86, with its companion GPUs from NVIDIA/AMD, is still holding its own in desktop chips. Dismissing it now, only a few years after the first serious alternative shows up, seems rather hasty to me.

Honestly, go into a business, any business of more than, I dunno, 50 employees, rip out every x86 system, rip out every IBM legacy system, etc. (That includes, naturally, cancelling/replacing every cloud service powered by x86 systems as well - if x86 is obsolete, surely having x86 software running in Microsoft's datacenter instead of yours is not acceptable.) Your boss better have the bankruptcy lawyer's number on speed dial, because he/she will need it once you have destroyed the business' ability to make money and/or sent it back to the 1930s and/or spent 20 times the business' annual profits on new systems and run out of money before those new systems are ready.

One other point, which again reveals your lack of real world experience: you don't appreciate the benefits of sticking with things that work. If your company has been using software X for 20 years, your staff is trained on how to use software X to do their jobs, your IT department knows how to support software X, how to assist staff with the regular problems seen with software X, etc, then who cares if software X runs on an OS that arguably isn't the best and on a processor architecture that arguably isn't the best? Or even if software Y might be a little bit better in the abstract, and might be the choice you would make if you started the company from scratch today? Do you really think that any reasonable boss will be like "oh, yes, let's rip out software X, migrate all the data to software Y, retrain all the staff on software Y, spend 6 months figuring out the quirks of software Y when we already know the quirks of software X, throw out our Windows hardware... and all this because ARM Macs are the best CPU architecture, x86 is obsolete, and Lenovo is 'cheap Chinese junk'"?! No - the reasonable boss will rip out the idiot who thinks this is a good business decision, i.e. you, and keep running software X on Windows/x86, ideally until his/her retirement and long after.

And that's what you don't seem to understand when talking about things like heat and battery life: the fact that a Lenovo laptop can run 8 hours on battery while an Apple silicon laptop can run 18 hours is irrelevant in the business world. The fact that it heats your lap more is irrelevant. The fact that the Lenovo laptop is compatible with your already-paid-for management systems (because, in a business, centralized management of computers is critical), that that laptop runs the software your business is based on, and that your staff is trained on how to use that software on that OS on that machine is highly relevant - because, frankly, changing any of those three things will cost tens, if not hundreds of thousands of dollars (or more), and introduce all kinds of risks. (If you've been in a business the first week new technology is rolled out, you will know what I mean...) And for what...? Double the battery life? Lower temperatures? Nicer screens? If you tell a boss that you want to rebuild half the company's IT infrastructure and retrain all the staff to double battery life on laptops that are run on battery 5% of the time, any reasonable boss will ask you what the price tag of an external battery pack for the Lenovo laptops is and tell you to buy one for everybody who complains about battery life. And if you don't understand why that boss is right and why you are wrong, well... you have a lot to learn about the corporate world. You can start learning it today or you can learn it from your first boss or two.

Always, always, always remember one thing if you intend to make a career in IT - the purpose of IT is to support the company/organization's operations and to do so as efficiently and invisibly as possible. Not to build some kind of abstract showcase of the best/trendiest/etc technology.
 
Do scaled resolutions on 4k 27" monitors really create such an issue with Windows that it's a problem for IT? No idea myself, but given how (relatively) commonplace those have become in the Windows world (they're essentially commodities these days, which is why they can be purchased relatively economically), I'd be surprised.

I've not found that text size is an issue on 4k 27" (the standard size for 4k) at 2:1 scaling on Macs, since I simply adjust the Zoom in my various apps to get the text size I want, as I do on all monitors. It's only the UIs you can't adjust, and those take up such a small percentage of my screen real estate that it has little significance for me. And to the extent it does have an effect, I prefer it, since it means wider scroll bars and larger close and full screen dots, which makes them easier to target quickly when I'm working rapidly.

[I run all three types of displays we're discussing side-by-side on my Mac, so I'm very familiar with all of them. My main monitor is a 5k 27" Retina, my RHS monitor is 4k 27", and my LHS monitor is an extended-HD (1920 x 1200) 24" (see details in my sig line).]
I think part of the problem is that people in IT don't want to try. People have seen glitchy scaling on Windows 5-10 years ago and just don't want to bother again. If someone wants a 27" with big pixels, I can buy the 1920x1080 model, plug it in, run it at 100% scaling in Windows, and not worry about it. If I wanted to do the same thing with a 4K 27" (which, in all fairness, might be a similar price), I just do not know how it will behave at 200% scaling, what software might be happy, what software might not, what adjustments might be needed, etc. So... what is the motivation for buying something and trying it out? It's not like the 4K monitor has any benefit that an ordinary business user will likely notice. So you just stick with the tried-and-true resolutions out of conservatism.
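Concretely (and this is just arithmetic, not a claim about how any particular legacy app behaves), the Windows scale factor simply divides the native panel resolution to get the logical desktop that well-behaved, DPI-aware apps lay out against; apps that never declared DPI awareness get bitmap-stretched instead, which is where the old blurriness came from.

```python
# Logical desktop size at common Windows scale factors on a 4K (3840x2160) panel.
# Illustrative arithmetic only; per-app behaviour depends on its DPI awareness.
native_w, native_h = 3840, 2160

for scale in (1.00, 1.25, 1.50, 2.00):
    logical_w, logical_h = round(native_w / scale), round(native_h / scale)
    print(f"{scale:.0%} scaling -> apps lay out against {logical_w} x {logical_h}")
```

So 200% on a 4K 27" gives the same 1920x1080 layout as the cheap panel, just sharper - which is precisely why it's hard to motivate the extra spend and the testing effort.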

My sense is that the market for 4K 27" monitors is largely gamers. And, frankly, if you have the GPU to back it up (i.e. render the game at native 4K), I think that makes a ton of sense for gaming.

My go-to resolutions for Windows desktops are: 24" 1920x1200 (also a big fan of those, despite the fact that most of the monitor industry chopped the bottom 120 pixels off and branded the resulting monstrosity "Full HD" a little over a decade ago), 27" 2560x1440 or 1920x1080 (depending on people's eyesight), and 34" 3440x1440.

One thing to note about monitors - there is a dramatic difference between what is offered to consumers (e.g. by brands like LG, Samsung, etc) and what is offered to businesses (who typically want a monitor branded Dell/Lenovo/etc and backed by those organizations' service infrastructures). What is offered to consumers uses HDMI interfaces, often has external power supplies (a major no-no in the business world if you ask me), and often has "weird" resolutions; what is offered to businesses uses DisplayPort, has nicely adjustable stands, has built-in power supplies and USB hubs, and sticks to fairly classic resolutions. And, of course, good warranties. Dell, for example, continues to sell a 1920x1200 24" monitor, as do HP and I think Lenovo, whereas I haven't seen that resolution at Worst Buy in 10 years.
 
MacBook − Windows − ThinkPad
iPhone − Android − Galaxy
Mercedes − Toyota − Lexus

In many markets you have the original inventor of the product, which over time becomes synonymous with build quality and continuous innovation and commands the highest luxury price points. And then there is the less desirable copycat brand, which often holds the biggest market share simply because it is a lot cheaper.

Out of envy, the maker of the cheap knockoff inevitably gets the idea to create his own luxury version of the cheap knockoff and ask for the same kind of money as the luxury brand. Why anyone would fall for this strategy is beyond me. A product with the price of a luxury brand and the appeal of a cheap copy should never find a customer. Of course you're supposed to buy a MacBook Pro if you're paying anywhere near as much!
You don't seem to understand.

Lexus' success is not due to being a "cheap knockoff" - it is largely due to the fact that it, along with Toyota, has an impeccable reputation for reliability and correspondingly-low cost of ownership.

Most people buy cars for transportation. If your car breaks down on the side of the road, that is a problem. If your dash lights up like a Christmas tree and needs a visit to the dealer/mechanic, that is a problem. That is TIME out of your day that you're not spending at work, with your family or friends, etc but rather dealing with a product and a problem that you would prefer not to have to deal with. People want a car that they can get into, start it, and it works, every single time every single day, and gets them to their destination.

And, once the product is outside warranty, you also have to look at the cost of the repair. If the Lexus has part X at the top of the engine, so it takes the mechanic an hour of labour to replace it, and the part is half as much money, whereas Mercedes located that part in an awkward hard to reach location that requires your mechanic to take apart stuff and reassemble it for 6 hours, and the part costs twice as much, then... guess what, your bill to repair the SAME failure on the Mercedes will be 2.5X what it was on the Lexus. And, of course, that's assuming the equivalent part is equally (un)reliable on both - if the Lexus one also lasts twice as long before needing replacement, then you save again.

Most people would rather have a vehicle that needs to be in the shop less frequently and that costs less money when it does have to be in the shop rather than a vehicle that gets better reviews, is more fashionable, or has slightly fancier technology they may not even notice. Even in the so-called luxury market, at least until you get to exotics, Bentleys, etc. which no one expects to be reliable or practical.

(And, it's worth noting, that is equally true of Lenovo laptops - they will need to be 'in the shop' less than MacBooks, and when they do need repairs, well, a technician showing up at your office and swapping a part for 10 minutes is something that Apple cannot match. AFAIK, Apple doesn't even offer onsite service... so... for example, let's say a user spills liquid into their laptop keyboard. On the Mac, you've destroyed the keyboard, the logic board, etc. Needs a trip to the Apple store and a couple of days of downtime. On the Lenovo, there are little channels so the liquid flows directly from the keyboard to the outside of the laptop, saving the motherboard, so you've only destroyed the keyboard, which the technician will come to your office and swap out in 10 minutes. Now, multiply this by 5000 laptops in a business, and that adds up to serious savings in person-hours. )
 
My sense is that the market for 4K 27" monitors is largely gamers. And, frankly, if you have the GPU to back it up (i.e. render the game at native 4K), I think that makes a ton of sense for gaming.
I don't have market data, but I think it would be the opposite.

I believe there are two classes of 4k monitors. Most I'm familiar with are in the same family as my Dell P2715Q, and were bought not for gaming, but by those who wanted improved text sharpness and/or higher resolution for photo or video editing. They've been around at accessible prices (~$500 or less) since about 2015, and are not suitable for gaming because they are limited to 60 Hz and don't have fast response times.

Then there are the gaming 4k monitors. I'm not a gamer, but I'd guess these remain relatively uncommon, since they're typically >=144 Hz, and relatively few gamers have cards capable of running games using 4k@144Hz.
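For what it's worth, part of the reason 4k high-refresh stays niche is the link budget, not just the GPU. A rough back-of-the-envelope sketch (uncompressed 8-bit-per-channel rates, ignoring blanking intervals, chroma subsampling and DSC, so indicative only):

```python
# Rough uncompressed video data rates, ignoring blanking, subsampling and DSC.
def gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for label, hz in (("4K @ 60 Hz", 60), ("4K @ 144 Hz", 144)):
    print(f"{label}: ~{gbps(3840, 2160, hz):.0f} Gbit/s")
```

That's roughly 12 Gbit/s for 4k@60 versus about 29 Gbit/s for 4k@144 - the latter is more than DisplayPort 1.4 carries uncompressed (roughly 26 Gbit/s of payload), so those monitors lean on DSC or reduced chroma, on top of needing a GPU that can actually render games at that rate.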

What is offered to consumers uses HDMI interfaces, often has external power supplies (a major no-no in the business world if you ask me), and often has "weird" resolutions; what is offered to businesses uses DisplayPort, has nicely adjustable stands, has built-in power supplies and USB hubs, and sticks to fairly classic resolutions.
Not so. All the external monitors I've purchased as a consumer have DP interfaces, USB hubs, internal power supplies, standard resolutions and nicely adjustable stands. Indeed, the only monitor I have with a somewhat "unusual" resolution is the WUXGA (1920 x 1200) I got from work.
 
4K 27" monitors are extremely popular for both multimedia content creation and for business productivity. Gamers are a mix, but 1440p monitors have a big share there, due to high refresh rates, and lower resolution (meaning higher frame rates). 2560x1440p monitors can support 165 Hz on typical machines.

Anyhow, my business productivity monitor has an external power supply and a 3:2 aspect ratio at 3840x2560, and I connect via USB-C (although it also has both mini-DisplayPort and HDMI). However, more typically they would be 16:9 3840x2160.
 
Not so. All the external monitors I've purchased as a consumer have DP interfaces, USB hubs, internal power supplies, standard resolutions and nicely adjustable stands. Indeed, the only monitor I have with a somewhat "unusual" resolution is the WUXGA (1920 x 1200) I got from work.
Where have you bought those lovely monitors? I have not seen any monitors with all those things at the Worst Buys and similar - obviously, you can order the "businessy" ones from Dell, Lenovo, etc. without being a business, but I have simply not seen these stocked in any consumer-centric stores.

I have had the occasional request from people being like "I want a second monitor to... please tell me what I can pick up at Worst Buy" and whenever I've looked, the only things I could find at WB were gamery monitors with 144Hz, styling that looks more appropriate for a teenage kid's room than a 35 year old's home office, etc., or low-end, HDMI-only, external-PSU models with non-adjustable stands. Very depressing.

Also very depressing is that someone who looks at that site, sees those lousy monitors for CAD$200-250, etc., then seems shocked when I suggest a monitor with those niceties in the same size for CAD$500.

(Also, calling 1920x1200 'unusual', while true, leaves me grumpy. 1920x1200 was a completely standard resolution for about 2-3 years in the late 2000s, seemed like it was going to become the new standard to replace the first-gen 17/19" 1280x1024 LCDs, then some genius MBA decided that they could cut 10% of the pixels off, market it as "full HD" to people who don't understand computer resolutions are/should be much higher than HD TV, and laugh all the way to the bank. Same thing happened on Windows laptops - you could get laptops with 1920x1200 screens in the late 2000s, then all the businessy Windows laptops went to 16:9. Apple offered a 1920x1200 monitor for years, too, didn't they? Funny thing is, Dell has continued to offer/upgrade/etc one or two businessy models in that resolution for the intervening 10 years - I presume you and I are not the only ones buying them, given that they continue to offer them and, indeed, modernize them - the U2421E has all the modern USB-C/docking/etc features...)
 
Have you ever worked in a business? And in a business that wasn't started up in the last 3 years, i.e. that has some amount of legacy software, data, etc in systems that management spent big money for, that staff is trained for, etc?

The idea that "x86 is obsolete technology" is laughable. The overwhelming majority of software that is used to make money in most industries runs only on x86/amd64. And most of the business software that doesn't run on x86 runs on, oh, I dunno, IBM zArchitecture, IBM i-formerly-AS/400, etc. Not sure if there's too much VMS still out there.

I shouldn't need to tell you this, but the purpose of IT in every organization is to provide computer systems that help that business make money at whatever that business does. If you work for Boeing in IT, your goal is to provide systems that can be used to design, support, etc commercial and military airplanes. If you work for Coca Cola, your goal is to provide systems that can be used to manufacture soft drink syrups, oversee bottling plants, distribute finished beverages. Etc. The software that helps these businesses make money is overwhelmingly written for x86 (either Windows or, for client-server systems, some form of Linux/Unix on the backend) or IBM architectures.

I don't want to make ageist comments, but your perspective sounds like someone who has no appreciation of 'legacy' systems and their prevalence in the world around you. Go to a store and buy something using a credit card - your payment will be processed using mainframes running IBM zArchitecture. If you think x86 is obsolete technology, what does that make zArchitecture? With its perfect 100% compatibility with software written all the way back in the mid-1960s? And flip the light switch in your room - do you think the power plant that supplies your electricity has control systems running on ARM, or anything else developed in the last two decades?!

There is a reason that x86 is popular in the business world and elsewhere. There is a reason that IT departments 'happily' spend money on expensive Windows laptops or desktops. It is not stupidity. Frankly, when you think people with decades of experience are doing things out of "stupidity", chances are, if you want to see stupidity, it is between your chair and your keyboard. Just because you have no real world experience doesn't mean that the people who see value in what you view as obsolete are wrong.

Frankly, you might as well go and call up, say, a construction company and tell them that they're stupid to buy pickup trucks when a Toyota Prius gets way better fuel economy and costs one third as much. They will laugh at you and tell you that it's impossible to carry the materials they need for their work in the back of a Prius. Just because you think a Prius is a higher-tech, more environmentally friendly vehicle does not mean that there isn't a place for one-ton pickup trucks in the construction industry. And if you are too closed-minded to see it, that's on you, not on them.

Also - I'm presuming that I'm a bit older than you, because I remember when everybody was excited about RISC this, PowerPC that, etc. Go and find a Mac magazine from Oct. or Nov. 1991 when the AIM alliance was announced, all the excitement about how amazing PowerPC was going to be, how it was the future, etc. Then realize that, 15 years later, Apple started selling systems that are functionally/architecturally IBM compatibles (I suspect, though I have never tried, at least the first generation of Intel Macs with "Boot Camp" are capable of booting MS-DOS; they can certainly BIOS boot Windows XP). And then you start to look at the world a bit differently - why did this exciting platform turn into a complete flop 15 years later, while Windows (which was a joke in 1991, two years before the first version of NT would ship) running on an architecture that everybody considered outdated went on to run the world?

And in 1991, if you were a kid like me, you encountered vague references of big systems - UNIX, VAX, AS/400, RS/6000, Alpha, Silicon Graphics, etc - in magazines. Big systems that people used for Serious Serious Work and that were talked about in Serious Places, not magazines you could buy for $4. And guess what - x86, and to a lesser extent Windows NT, basically ate all those big systems. The architecture that everyone considered a joke in 1991 (with its 640K memory contortions, etc) and two operating systems that didn't exist in 1991 (Windows NT and Linux) somehow ate all the unmentionably-big systems.

If someone, in 1992, had said that they wanted to port some Big Serious Software that ran on *NIX workstations or IBM systems to Windows on x86, any 8-12 year old kid who read magazines would have been like "you're an idiot. x86 is dead. RISC FTW! And Taligent!" (Remember that Windows NT ran on a bunch of exciting RISC architectures, too... all of which promptly flopped. And as for Taligent, I don't think their operating system went much beyond the excited magazine articles.) And yet... 10 years later, essentially all of that Big Serious Software was running almost exclusively on NT or Linux on x86 machines.

Excitement (and excitement over perceived technical merit) does not guarantee long-term success; in fact, if you took a magazine like, say, Byte (first publication, 1975) and went back over everything they wrote about, looked at how those technologies ended up doing, etc, I would guess that most of the things they were excited about ended up flopping. And meanwhile, x86, Windows, etc. just kept going and going. If you had told people in 1981, reviewing the original IBM PC with its 8088, that descendants of that architecture would not only dominate the nascent microcomputer market, but would swallow the minicomputer market and make a serious dent into mainframes, they would have laughed at you.

In its feature on the 10 best car engines of the 20th century (https://www.wardsauto.com/news-analysis/10-best-engines-20th-century), Ward's Auto described the GM/Buick 3.8L engine as "the poster child of a bad idea turned good through fastidious refinement." That, in my mind, would describe the x86/IBM PC architecture as well - it was a bad idea originally, it remained a bad idea, it's still considered a bad idea 40 years later, yet somehow through fastidious refinement (largely by Intel and Microsoft, with some contributions from Compaq, AMD, Linus Torvalds and others along the way), it has basically dominated everything else and delivered absolutely unbeatable performance-per-dollar. In fact, in 40 years of it being considered a bad idea, it is only in the last, oh, 3 years, and only after Intel lost their manufacturing edge to TSMC, that other alternatives have shown ANY prospect of supplanting it. And so far, primarily in laptops. Not sure where ARM-for-servers efforts are at; x86, with its companion GPUs from NVIDIA/AMD, is still holding its own in desktop chips. Dismissing it now, only a few years after the first serious alternative shows up, seems rather hasty to me.

Honestly, go into a business, any business of more than, I dunno, 50 employees, rip out every x86 system, rip out every IBM legacy system, etc. (That includes, naturally, cancelling/replacing every cloud service powered by x86 systems as well - if x86 is obsolete, surely having x86 software running in Microsoft's datacenter instead of yours is not acceptable.) Your boss better have the bankruptcy lawyer's number on speed dial, because he/she will need it once you have destroyed the business' ability to make money and/or sent it back to the 1930s and/or spent 20 times the business' annual profits on new systems and run out of money before those new systems are ready.

One other point, which again reveals your lack of real world experience: you don't appreciate the benefits of sticking with things that work. If your company has been using software X for 20 years, your staff is trained on how to use software X to do their jobs, your IT department knows how to support software X, how to assist staff with the regular problems seen with software X, etc, then who cares if software X runs on an OS that arguably isn't the best and on a processor architecture that arguably isn't the best? Or even if software Y might be a little bit better in the abstract, and might be the choice you would make if you started the company from scratch today? Do you really think that any reasonable boss will be like "oh, yes, let's rip out software X, migrate all the data to software Y, retrain all the staff on software Y, spend 6 months figuring out the quirks of software Y when we already know the quirks of software X, throw out our Windows hardware... and all this because ARM Macs are the best CPU architecture, x86 is obsolete, and Lenovo is 'cheap Chinese junk'"?! No - the reasonable boss will rip out the idiot who thinks this is a good business decision, i.e. you, and keep running software X on Windows/x86, ideally until his/her retirement and long after.

And that's what you don't seem to understand when talking about things like heat and battery life: the fact that a Lenovo laptop can run 8 hours on battery while an Apple silicon laptop can run 18 hours is irrelevant in the business world. The fact that it heats your lap more is irrelevant. The fact that the Lenovo laptop is compatible with your already-paid-for management systems (because, in a business, centralized management of computers is critical), that that laptop runs the software your business is based on, and that your staff is trained on how to use that software on that OS on that machine is highly relevant - because, frankly, changing any of those three things will cost tens, if not hundreds of thousands of dollars (or more), and introduce all kinds of risks. (If you've been in a business the first week new technology is rolled out, you will know what I mean...) And for what...? Double the battery life? Lower temperatures? Nicer screens? If you tell a boss that you want to rebuild half the company's IT infrastructure and retrain all the staff to double battery life on laptops that are run on battery 5% of the time, any reasonable boss will ask you what the price tag of an external battery pack for the Lenovo laptops is and tell you to buy one for everybody who complains about battery life. And if you don't understand why that boss is right and why you are wrong, well... you have a lot to learn about the corporate world. You can start learning it today or you can learn it from your first boss or two.

Always, always, always remember one thing if you intend to make a career in IT - the purpose of IT is to support the company/organization's operations and to do so as efficiently and invisibly as possible. Not to build some kind of abstract showcase of the best/trendiest/etc technology.
Great rant ;)! I also find myself frustrated with those who blindly dismiss those who use x86 and don't understand the legacy systems that would have to be completely reworked to move away from it.

I myself started with punch cards, then moved to an IBM 360 timeshare, and then VAX.

Having said that, I think the following represents too much resistance to change, since it doesn't recognize that there can be benefit in moving to better tech:
If your company has been using software X for 20 years, your staff is trained on how to use software X to do their jobs, your IT department knows how to support software X, how to assist staff with the regular problems seen with software X, etc, then who cares if software X runs on an OS that arguably isn't the best and on a processor architecture that arguably isn't the best?
Specifically, I'd commend to your attention this summary of a 2019 JAMF presentation by Fletcher Previn, CIO at IBM, showing the results of IBM's decision to give employees the choice of using a Mac or PC starting in 2015.

IBM now supports 290,000 Apple devices. There were three key findings:
1) Employees who use Mac are more likely to stay with IBM
2) Employees who use Mac are more likely to exceed performance expectations
3) They need one IT support employee for every 242 PC users vs. one IT support for every 5,400 Mac users. In part because of that, over a four-year period, there's a savings of $270-$540 per device when an employee chooses Mac rather than PC.

If IBM had followed the philosophy of "who cares if software X runs on an OS that arguably isn't the best", then they never would have found this out.
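Just to make those ratios concrete, here's trivial arithmetic using the numbers quoted from the presentation (the 100,000-seat fleet is a made-up round number purely for illustration):

```python
# Support-staffing implication of the ratios quoted from the JAMF/IBM presentation.
# The 100,000-seat fleet is a made-up round number for illustration only.
pc_users_per_support, mac_users_per_support = 242, 5_400
fleet = 100_000

print(f"All-PC fleet : ~{fleet / pc_users_per_support:.0f} support staff")
print(f"All-Mac fleet: ~{fleet / mac_users_per_support:.0f} support staff")
```

That's roughly 413 versus 19 support people for the same headcount - over a 20x difference, whatever the underlying causality - which is the kind of number that moves TCO discussions.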

 
4K 27" monitors are extremely popular for both multimedia content creation and for business productivity. Gamers are a mix, but 1440p monitors have a big share there, due to high refresh rates, and lower resolution (meaning higher frame rates). 2560x1440p monitors can support 165 Hz on typical machines.

Anyhow, my business productivity monitor has an external power supply and a 3:2 aspect ratio at 3840x2560, and I connect via USB-C (although it also has both mini-DisplayPort and HDMI). However, more typically they would be 16:9 3840x2160.
Does 3840x2560 work nicely in retina 2x mode? I would think it... should, better than 3840x2160 on a 27" which seems horrible to me (I'm just picturing a 27" 1920x1080 retinified... and... comparing it to the 5K iMac I am typing this on. Love my 5K iMac.)

As for external power supplies, I have no issues with those for home/home offices. But for a work environment, I think the likelihood of those power supplies getting lost is too high...
 
Where have you bought those lovely monitors? I have not seen any monitors with all those things at the Worst Buys and similar - obviously, you can order the "businessy" ones from Dell, Lenovo, etc. without being a business, but I have simply not seen these stocked in any consumer-centric stores.
https://www.bhphotovideo.com/ :).

Granted, that meant I had to rely on reviews rather than seeing them in person. These days, pretty much the only place you can see high-end monitors in person are at Apple stores and other Apple retailers (like university bookstores). [The Microsoft stores and/or Micro Center may or may not have 4k monitors on display.]

Also, a common refrain from those reviewing the current generation of 27" 4k Dell monitors is that they're not as good as the Dell P2715Q I have.
 
Having said that, I think the following represents too much resistance to change, since it doesn't recognize that there can be benefit in moving to better tech:

Specifically, I'd commend to your attention this summary of a 2019 JAMF presentation by Fletcher Previn, CIO at IBM, showing the results of IBM's decision to give employees the choice of using a Mac or PC starting in 2015.

IBM now supports 290,000 Apple devices. There were three key findings:
1) Employees who use Mac are more likely to stay with IBM
2) Employees who use Mac are more likely to exceed performance expectations
3) They need one IT support employee for every 242 PC users vs. one IT support for every 5,400 Mac users. In part because of that, over a four-year period, there's a savings of $270-$540 per device when an employee chooses Mac rather than PC.

If IBM had followed the philosophy of "who cares if software X runs on an OS that arguably isn't the best", then they never would have found this out.

In an organization of IBM's scale, I completely agree - they can experiment in a way that a small organization cannot. And if the data is promising, then great! And it also depends what those people's work software is - if they're working entirely with cross-platform tools, or if they're heavily using VDI/remote resources, then what you have on your desk matters a lot less. If they needed to use some legacy Windows tool to do their job, then enabling them to do that while having a Mac... requires some effort (or replacing the legacy tool). Effort that's easy to justify on IBM's scale. :)

(And what is also interesting is the question of causality - e.g. is it using a Mac that makes it more likely to exceed performance expectations? I doubt it, but I am no HR professional. Or is it that high performers are more likely to pick a Mac for whatever reason? In which case, supporting/integrating Macs makes sense if only as a retention tool for those high performers.)

I'm looking at it more from the perspective of a smaller organization - if you're in, say, the 25-199 employee range, I don't think you have the staffing to do that kind of experiment... and you definitely are not going to have the buy-in for a wholesale transition from A to B unless senior management has reached a catastrophic frustration point with A. And that's unquestionably one of the reasons why smaller businesses, in my experience, tend to be on the technological trailing edge - they don't have the scale/staffing/etc of large businesses, they do have some specialized/customized/etc things and trying to replace those things would be extremely disruptive.
 
https://www.bhphotovideo.com/ :).

Granted, that meant I had to rely on reviews rather than seeing them in person. These days, pretty much the only place you can see high-end monitors in person are at Apple stores and other Apple retailers (like university bookstores). [The Microsoft stores and/or Micro Center may or may not have 4k monitors on display.]
And... we come full circle. If you want a quality non-Apple peripheral or computer, you basically have to buy it online. You can't see a nice monitor in person at any place either one of us can think of, and I can't think of any place that has nice Windows laptops on display either. It almost feels like the ONLY decent/high-end computer stuff in a Worst Buy will be the Apple products...

Are there still Microsoft stores in some places? I think all the ones in Canada shut down during the pandemic... which, in some ways, is a shame - while I didn't necessarily agree with all their merchandising decisions, they at least had a reasonable selection of higher-end Windows stuff. And... the other very nice thing - none of their machines had bilingual keyboards. (Something else Apple doesn't do... but most consumer laptops from HP/Acer/etc in Canada are sold with those awful, awful bilingual keyboards. Except the ones sold by the Microsoft store.)
 
Does 3840x2560 work nicely in retina 2x mode? I would think it... should, better than 3840x2160 on a 27" which seems horrible to me (I'm just picturing a 27" 1920x1080 retinified... and... comparing it to the 5K iMac I am typing this on. Love my 5K iMac.)
I see that you're in Canada.

You can sometimes get the 27" LG UltraFine 4K for CA$399 - $435, but it seems it's $499 at the moment, and that's online at Amazon.ca. There's also the 23.7" LG UltraFine 4K, but it's actually more expensive. However, for both of those, go 2X-scaled and you're down to 1080 vertical resolution.

My 3840x2560 monitor works beautifully at 1920x1280, and in fact, that is the default because that's 2X scaled. 1280 is a decent vertical resolution, but I don't run that because it's not enough horizontal screen real estate and the fonts are too big. I run 2304x1536. Most typical Mac users seem to run it at 2560x1707 though, and I've seen a few here at MacRumors run it at 3008x2005.
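Since the scaling arithmetic trips people up, here's a minimal sketch of it in Python (purely illustrative; the mode list is just the ones mentioned above, assuming a 3840x2560 panel). As I understand it, in the exact 2X mode the backing store equals the panel, and in every other scaled mode macOS renders at 2x the "looks like" size and downsamples to native.

```python
# Minimal sketch of the macOS HiDPI arithmetic for a 3840x2560 panel.
# Assumption: macOS draws each "looks like" mode at 2x its size, then
# downsamples that backing store to the panel's native resolution.

NATIVE = (3840, 2560)  # physical pixels

looks_like_modes = [
    (1920, 1280),  # default 2X mode: backing store == native, no downsampling
    (2304, 1536),  # what I actually run
    (2560, 1707),  # what most people seem to run
    (3008, 2005),  # a few MacRumors users run this
]

for w, h in looks_like_modes:
    backing = (w * 2, h * 2)  # everything is rendered at 2x
    if backing == NATIVE:
        note = "1:1, no scaling"
    else:
        note = f"downsampled to {NATIVE[0]}x{NATIVE[1]}"
    print(f"looks like {w}x{h} -> rendered at {backing[0]}x{backing[1]} ({note})")
```

Run it and you can see why 2304x1536 and up still look sharp: the GPU is drawing 4608x3072 (or more) and squeezing it onto the panel, which is also why the higher scaled modes cost a little extra GPU/memory.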

[Attached screenshot: MateView resolution options over USB-C]


I bought mine at a brick & mortar store, Canada Computers (not a major chain), for CA$699, and yes, you could return it for a full refund in the first 2 weeks after purchase. You can also buy it at Amazon.ca ($699) or Best Buy online (direct from the manufacturer, at $899?!? WTF?!!), but not Best Buy in-store.

BTW, at my previous workplace, the IT-provided monitors were a case of "you get what you get and you don't get upset." However, in some departments, those of us with our own offices were perfectly welcome to buy our own monitors if we wanted... with our own money, that is. The institution had thousands of employees, but not tens of thousands.
 
There are some badly programmed apps that don't handle scaling well, but apps properly developed with the Windows APIs scale really well these days.
In the enterprise setting it's not unusual to have very old apps in use. They may work badly on Windows. They won't work on macOS at all.
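For what it's worth, the "properly developed with the Windows APIs" part mostly comes down to the app declaring its DPI awareness. Here's a rough, Windows-only illustration in Python via ctypes; both calls are real Win32 entry points, but a shipping app would normally declare this in its application manifest rather than at runtime.

```python
# Rough illustration (Windows-only) of opting a process into DPI awareness,
# which is the difference between apps that scale crisply and the blurry,
# bitmap-stretched legacy ones. Purely a sketch; real apps use the manifest.
import ctypes
import sys

if sys.platform == "win32":
    try:
        # Windows 8.1+: 2 == PROCESS_PER_MONITOR_DPI_AWARE
        ctypes.windll.shcore.SetProcessDpiAwareness(2)
    except (AttributeError, OSError):
        # Fallback for older Windows: system-DPI aware only
        ctypes.windll.user32.SetProcessDPIAware()
```

An app that never makes this declaration is told it's running at 96 DPI and Windows stretches its window as a bitmap, which is exactly the "works badly" behaviour you see with very old enterprise apps on high-DPI monitors.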
 
So it’s funny, I checked just to see how much I was speculating, and our uni recommends either an XPS 13 or a MacBook Air for students. Students in some specific majors, say computer science or biology, have other recommendations, but those are the standards. I haven’t seen an XPS in person recently, but I suspect they’re more or less competitive with the Apples in build quality. So that raises the question of why the Apples are so much more popular - among our undergraduates - than the Dells. I don’t have an answer. For years, getting IT to support Macs was like pulling teeth, and they still don’t get it entirely. Sometimes you luck out, but often not so much. I had them get me a trashcan Mac Pro when I started and a 4K display: I was working on a Retina MBP and using the lower-res screens for a desktop was annoying, and my students were starting to do some higher-res projects. They capitulated, but I was largely on my own for support.
 

I am guessing that we have come to a point where the majority of our work is either done in the browser (e.g. Google Docs, task management, Slack), or there's a Mac app available for said task (like Zoom). Your final decision then comes down to more intangible factors like preference of OS, or Apple silicon vs Intel (and the ramifications, like battery life), or how entrenched one is in the Apple ecosystem.
 
I did read the entirety of your post; honestly, the way you wrote it could be interpreted the way I understood it.

My bad – I should've perhaps written "also then able" rather than simply "then able" to avoid the ambiguity.

Now, Apple could’ve given us the option to use RGB, but I assume they didn’t want to confuse people (“Just works” is often at odds with power users) because they would switch to RGB and not understand why they can’t do Wide Color/HDR when attaching the Mac to their TV. You can argue you don’t need HDR, but a lot of people want to hook their Mac to their HDR TVs.

Pretty much this. Having to disable SIP to solve the problem is really not ideal at present. Would love more accessible power-user options (ones that arguably don't compromise the system).

Honestly, if you need a professional external display and care about differences between RGB and YCbCr you shouldn’t be using HDMI anyway. It’s a port aimed at content consumption and many monitors have HDMI ports so that they can be used with various entertainment devices like game consoles or Blu-ray players.

TLDR: this is probably not a bug, but a design decision made by Apple. Use DisplayPort for external displays and HDMI for TVs and Projectors.

I quite dislike HDMI (given it's generally inferior to DP) and wouldn't use it if I didn't have to, but I run multiple devices on a single monitor, so the DP inputs are already occupied. I've tried sorting this out with KVM kits, but quality DP 1.4a solutions are effectively a myth. (Would gladly appreciate any suggestions here.)
 
Lexus' success [in the US] is not due to being a "cheap knockoff" - it is largely due to the fact that it, along with Toyota, has an impeccable reputation for reliability and correspondingly-low cost of ownership.
People who bought the idea that VW is a luxury brand also bought Lexus. The bar for being accepted as a luxury car in the US is incredibly low. That you argue from low cost of ownership shows that you don't understand what luxury even means. Hint: it has nothing to do with practicality or cost savings. Being prohibitively expensive, and therefore exclusive to a smaller group of people, is often a staple of luxury. It doesn't mean a little bit nicer than average. Luxury is a level of perfection at which you don't even ask for a price. And then again, you can buy a reliable Toyota under another name for about the same price! How is that going to work? Probably about as well as a "luxury" Windows laptop with a good old trusty i5 in it, causing global warming and skin burns, because efficiency apparently isn't a concern for businesses. Yeah right, but that's why businesses should buy a Toyota for a Toyota price, not a Lexus for the price of a BMW.
Most people buy cars for transportation. If your car breaks down on the side of the road, that is a problem.
Most people cannot afford a car in their lifetime, or at least can't be picky about the brand. Most people (including me) would never spend north of $2,000 on a laptop, not even an Apple laptop. The demographic that would spend that amount of money on an i5 ThinkPad is certainly not most people. You would need to be crazy dependent on legacy business software, and also not aware that other manufacturers exist.
 