However, I agree that USB-C on mobiles is going to be short-lived as they're on course to become 100% wireless (which makes sense for a phone, and is worth a few compromises).

I think Android will go this route as well. Many manufacturers have followed Apple in removing the analog headphone jack, which has pissed off users. I haven't been tremendously impacted because I don't listen to music outside that much anymore; it's either at home through my receiver or in the car. To be honest, I'm not a huge fan of 2010s music and I struggle to find new records to listen to. I'm a big '80s-'90s metal and alternative rock fan, along with classic blues and jazz.
 
I have gone from a 25 year happy customer and Apple fanboy, to a disgruntled loyal customer, and finally to someone who can't stand their crap anymore and wants out.

Every product I have bought from them over the past 6 or so years has had serious issues. The last decent product I bought from them was my 5,1 Mac Pro - the last great Mac.

The latest victim of my Apple Purchase Nightmare is my P.O.S. £70 wireless rechargeable Magic Mouse II that you can't use while plugged in. It's one year old and the battery is already on its way out and needs constant recharging. Seventy f***ing pounds! The world does not need a £70 disposable mouse with a one-year use-by date which is inoperable while plugged in to charge.

Nothing is safe from Apple. They have complete and utter disdain and contempt for their customers. I am done.
 

Sorry to hear this; I now feel like we are being cheated on purpose. That mouse has had bad reviews since release, but they don't think the online complaints matter, or are even valid.
I won't buy anything from Apple again. Thankfully, other companies still make reliable products for our Macs.
 
The dislike of Windows isn't recent; it's been pretty constant since Windows 3.1 at the very least.

Having used Windows since 3.1, I'd say Windows 8 onwards are the most reviled versions of Windows I have ever encountered.

It's the first version since Windows 95 (except for Windows ME) that people are refusing to upgrade to, even though it was FREE.


Speaking of Linux black belts, I've been paid to administer Linux professionally since 1996.
 
You are imagining the dislike of Windows.

Mac forums are an echo chamber.

A lot of the die-hard enthusiasts became embittered when Microsoft shipped bad OSes that you were forced to buy because of their market position.

It never occurred to them that Microsoft could get a new CEO, do an about-face, and start shipping really, really good software.

Since the '90s, most of the world has used Windows. (macOS has single-digit market share.)

And most Windows users are satisfied.

The "everyone hates Windows, Windows is bad software" meme just isn't true the way Mac boosters wish it was true.

My first computer that I owned was a TRS-80 that I bought back in 1979. This was pre-Microsoft; it had its own proprietary operating system. I helped a friend build a Heath/Zenith H-88 in 1981, but he wasn't impressed with the computer. I believe it had a version of CP/M for an OS. Next I had my first Apple, the Apple II. It had Apple DOS. My next computer was my favorite line of computers, the Amiga. I bought the original Amiga 1000 in 1986, and an Amiga 3000 in 1989. By far the most advanced computer for its time. In 1986 the Amiga could run multiple versions of itself in RAM (a very early VM), had sound and graphics co-processors, and could display 4096 colors. What was called an IBM compatible (what today we would call a Windows PC, but IBM was the more influential name in that collaboration in 1986) could display 256 colors. Macs were monochrome machines with a simple speaker. IBM compatibles could get an add-on card to play a tinny 8-bit sound. The Amiga had stereo sound, could multi-task, came with 256K of memory and could expand up to 8 megs without requiring memory-swapping tricks - all in 1986. Leagues ahead of any other PC. But because of its graphics and sound it was dismissed as a 'game machine'. Back then that was an insult.

From 1992 on I built a number of computers from parts ordered from Computer Shopper magazine, all with various components but all DOS machines. I tried to use Windows 1.0 and 2.0, but they were so buggy that I gave up. So did most other people. I did, like most computer users, install Windows 3.0, then the much more successful 3.1. I don't remember how many floppy disks it took to install, but it was a lot. In 1995 I installed OS/2 on my Windows 3.1 computer. OS/2 allowed you to partition the hard drive into two separate OSes and choose which one to boot at startup. In 1996 I bought my first computer with Windows pre-installed, Windows 95. I then bought a computer with Windows 98, then Windows 2000, and finally Windows XP. There were LOTS of complaints when each one of these was released, because of the number of things that changed from the previous OS and because of programs that quit working. Windows was also a memory hog; if you were willing to run a DOS program you could actually run much faster than the Windows version. With each version of Windows you would find video and audio programs that wouldn't work, either because Windows didn't like the drivers the card manufacturer provided that were supposedly compatible with Windows Whatever, or because the manufacturer had gone out of business, or was working on the next version and decided to just ignore all the unhappy customers. From Windows 95 onwards it was really, really easy to lock up a computer. The Blue Screen of Death still exists, but from 1995 until 2006 or so it was a frequent visitor, unlike now. Windows XP SP1 and SP2 caused me a lot of overtime at the small company where I worked; we didn't really have someone in an IT role and I knew the most about operating systems, so I spent a lot of time fixing things, either because the operator messed with something and caused problems or because it was a day ending in Y.

Windows XP SP3 was the most reliable and stable Windows version released up to that time, but it didn't come out until 2008. It wouldn't surprise me if a fairly large number of computers were still running it today, because it was extremely stable. But I had my fill of screwing around with Windows and wanted something at home that I didn't need to mess with, so I bought my second Apple computer, a 24-inch iMac, in 2008 - 26 years after my first Apple. Because I was mostly happy with it (a one-button mouse? Who the hell thought that was a good idea?), because my 1-year-old Droid could no longer be updated, and because companies other than AT&T, who to this day has lousy coverage in Nebraska, could finally sell the iPhone, I bought my first Apple phone.

If 3 Apple computers (second iMac bought in 2016) and 2 iPhones in 36 years make me an Apple Fan Boi, then so be it.
 
I miss my cheesegrater G5. I had wondered if there was some way to transplant the guts of my i7 iMac into the case, but it was more of a pipe dream than a serious thought.

Currently my 4GHz i7 has 32 gigs of RAM and the stock 1TB Fusion Drive, Boot Camped with Windows 10 for when I need it, plus two external displays and multiple external hard disc drives. And lots of audio interfaces for music production.
 
Many manufacturers have followed Apple in removing the analog headphone jack, which has pissed off users.

Yes - because removing it does have downsides... e.g. some people have really expensive or beloved existing headphones. On a flight, I want earphones that can connect to either my mobile or the in-flight entertainment system (I've yet to be on a flight where the connect-your-iPad-to-the-plane-wifi-to-watch-movies thing actually existed or worked). Adapter dongles are the last thing you want on a plane (future flights will be falling out of the sky under the weight of Lightning-to-3.5mm dongles lost underneath the seats). There's quite a lively market in GarageBand and other music creation apps for iOS, but the lag on most wireless headphones makes them a complete non-starter for that (I assume it affects games as well).

...and often "progress" does have downsides. The problem is when a change is made that has no upside to compensate - and the removal of the headphone jack from the iPhone 7 was a perfect example of that: it wasn't drastically thinner or better looking than the 6, it didn't have better water resistance than competing phones with jacks, and the existence of a jack on the 6 wasn't stopping anybody who wanted to use Lightning or Bluetooth headphones.

Now, if they can say "by eliminating the headphone jack, introducing wireless charging and re-designing the mic and speakers we've been able to totally seal the phone against water and dust" - then we'd have a payoff for losing the 3.5mm jack.

USB on the MacBook Pro is similar - they could surely have kept a USB 3 port or two without making it drastically thicker (most of the competition has a mix of USB 3 and USB-C, especially at the 15" MBP level).

However, I see that many Android phone makers are even copying the iPhone X "notch/ears" - a messy kludge that only really says "oops - we want to claim an edge-to-edge screen but haven't invented the thru-display selfie camera yet". So maybe we're not dealing with rationality here.
 

Nicely said!
 
Having used Windows since 3.1, I'd say Windows 8 onwards are the most reviled versions of Windows I have ever encountered.

It's the first version since Windows 95 (except for Windows ME) that people are refusing to upgrade to, even though it was FREE.

I also hated Windows 8 more than prior versions but I like Windows 10. I do know many Windows people who walk around complaining about it incessantly though, so certainly not everyone agrees with me.

Speaking of Linux black belts, I've been paid to administer Linux professionally since 1996.

What's your impression of Windows 10 with the Linux subsystem?
 
I also hated Windows 8 more than prior versions but I like Windows 10. I do know many Windows people who walk around complaining about it incessantly though, so certainly not everyone agrees with me.

I actually liked Windows 8.1 enough to prefer having it on my work computer instead of Windows 7. Most of the Metro changes were horrendous, but other changes in the system (native .iso support, shortcuts for launching both elevated and non-elevated PowerShell/cmd in Explorer, better multi-monitor support etc) made my day-to-day work much nicer.

Windows 10 is for my use the best version so far, so it’s a pity Microsoft has tarnished it with the early privacy issues and ads in the lock screen later on.
 
I also hated Windows 8 more than prior versions but I like Windows 10. I do know many Windows people who walk around complaining about it incessantly though, so certainly not everyone agrees with me.



What's your impression of Windows 10 with the Linux subsystem?

You might not see it, but despite being given away FREE originally and consuming fewer resources than Windows 7, it has still taken a very long time to displace it to any degree.

For enterprise, it's a joke due to Microsoft's enforcement of "agile" development (i.e., you become our beta testers, like it or not).
 
Apple needs to stop spending their resources on self-driving cars and stick to technology in the IT sector.

Maybe I'm mistaken, but I think Apple has about a hundred thousand employees now. The mystery to me is how they can have so many people working on projects and move so slowly on every technology that they develop. They seem to move at an absolute snail's pace and the quality isn't exactly perfect either. They make huge mistakes with the products that they release (antenna gate, bend gate, keyboard gate).

I'm finding myself less and less patient with their overpriced and outdated technology on the computer side. And their odd decisions on the consumer tech side (headphone jack).

I have many fond memories of using Apple Computers throughout my life. And their platform integration between all the services is still the best there is. But at this stage it's more of a condemnation of how poorly everyone else makes their platforms work. It's inexcusable that Microsoft and Google haven't at least matched what Apple can do in 2018.

For example, if I want to post on LinkedIn on my Android phone and decide to attach a photograph, I get a freak-show nightmare of a file explorer digging through storage to find the photo I want to attach. At first I thought it was because I was using a Samsung device and they have their own Gallery app which competes with Google Photos. But on the Pixel I'm using right now I have the same issue. So the file system integration between cloud and local storage is terrible in comparison to Apple's, which is absolutely seamless.

One more example. Say I want to send you all the photos I took today: there are only two ways I can do that on an Android phone. I can either send you the original files (all 100 megabytes of them) or I can send you a link to my personal Google Photos library - the latter of which is maybe something I don't want to share with you, for countless reasons. I did a quick online search to see what I was missing, and the Android expert on the forums suggested going to your desktop, exporting the photos from Google Photos to a new folder on your desktop, and then reimporting the smaller-sized versions into a new album. Yeah, I'll get right on that.

Unlike Apple, which gives you the option to reduce the file size simply and elegantly when you send a bunch of images. My coworker insists the feature used to be in Android but agrees that it's no longer there. Google wants you trapped in their ecosystem; it's not something that embellishes an already great product. They want you prisoner.

Most Android users live with these limitations and never even notice them - it boggles my mind!

I guess what I'm saying in this long-winded post is that all the technology companies right now would be wise to focus on fewer products and make them as perfect as possible.
 
Let's look at this: my wife's i7 iMac from 2009 is now starting to exhibit issues with the power supply (that horrid buzzing noise that presages failure before long). The power supply had already been replaced once while the machine was under AppleCare.

It's been out of warranty for quite a while now, so I talked to my local computer repair depot today to discuss whether it was worth repairing - and they explained that they seldom accept repairs for any computer that old (going on 9 years), and that they do not have the power supply in stock; they would need to source it from a third party (if they can find one) and then do the repair. The extra cost would potentially make it untenable. Replacing it with a current equivalent is much more expensive than it originally was (I think it was originally about $1700 - current models are pushing $2500+ if she wants the i7 CPU). I expanded that machine to 16 gigs of RAM too.

I can get her an i5 iMac with the current 5K display, and with the hardware being from last summer's refresh (or an even older 2015 refurb) the prices are better - she gets a decent 5K display, and with a base-model Fusion Drive the price is less overall. But I wonder if the performance would be up to par for some of the more demanding software she uses (she does voiceover work professionally and uses a lot of audio software to do edits and clean up background noise, etc.).

It also would be not so great to buy her something new only to have a refresh appear in a month or two.

I'm at a loss at the moment. I feel like we just keep nursemaiding her current machine along until it dies.
 
Maybe I'm mistaken, but I think Apple has about a hundred thousand employees now. The mystery to me is how they can have so many people working on projects

Most Apple employees are retail.

and move so slowly on every technology that they develop. They seem to move at an absolute snail's pace and the quality isn't exactly perfect either. They make huge mistakes with the products that they release (antenna gate, bend gate, keyboard gate).

Engineering doesn't scale that way. You can't just make it go faster by adding more people to the problem.

As for the pace and quality, I think it compares just fine to the rest of the industry.

I guess what I'm saying in this long-winded post is that all the technology companies right now would be wise to focus on fewer products and make them as perfect as possible.

I do think Apple may be spreading themselves a little too thin.
 
Let's look at this: my wife's i7 iMac from 2009 is now starting to exhibit issues with the power supply (that horrid buzzing noise that presages failure before long). The power supply had already been replaced once while the machine was under AppleCare.

It's been out of warranty for quite a while now, so I talked to my local computer repair depot today to discuss whether it was worth repairing - and they explained that they seldom accept repairs for any computer that old (going on 9 years), and that they do not have the power supply in stock; they would need to source it from a third party (if they can find one) and then do the repair. The extra cost would potentially make it untenable. Replacing it with a current equivalent is much more expensive than it originally was (I think it was originally about $1700 - current models are pushing $2500+ if she wants the i7 CPU). I expanded that machine to 16 gigs of RAM too.

I can get her an i5 iMac with the current 5K display, and with the hardware being from last summer's refresh (or an even older 2015 refurb) the prices are better - she gets a decent 5K display, and with a base-model Fusion Drive the price is less overall. But I wonder if the performance would be up to par for some of the more demanding software she uses (she does voiceover work professionally and uses a lot of audio software to do edits and clean up background noise, etc.).

It also would be not so great to buy her something new only to have a refresh appear in a month or two.

I'm at a loss at the moment. I feel like we just keep nursemaiding her current machine along until it dies.

Apple's current iMacs are very nice machines all around, despite what some may say in these forums. That being said, if you can limp along until September/October, I certainly would try. I believe Apple will be releasing updates for both the iMac and the MacBook Pro that contain Intel's latest 8th-generation CPUs. The core counts for these CPUs (Coffee Lake) have been increased, from 2 to 4 for the i3 and from 4 to 6 for the i5 and i7 (6 cores/12 threads). The GPUs in the current iMacs are not bad at all, unless you are a gamer or do heavy-duty video production work.

If your wife's current iMac bites the dust, I would highly recommend the 21.5" iMac with either the 3.4GHz CPU or the 3.6GHz i7 upgrade; upgrade it to 16GB of RAM (unless she has to have 32GB) and to 256GB or 512GB of flash storage. The caveat here is that those upgrades (Core i7, 16GB RAM and 256GB SSD) put you right at $1,999.00 retail.

You might be better off getting her the 27" i5 with the 3.4GHz CPU, upgrading the RAM yourself (it's expensive either way you go) and upgrading it to the 256GB or 512GB SSD. You end up with the larger screen for not much more money.

Myself, I almost always buy refurbished equipment directly from Apple, save the 15% they take off, and make sure to add AppleCare(+), which on the iMac is reasonably priced. You can find various configurations with flash storage or more RAM any given day of the week on Apple's website, but you have to jump on it when you see it as they go fast. Monday and Tuesday mornings, or around lunchtime EST, seem to be the best days/times. YMMV.

The 2017 iMacs are so much faster than the 2009 i7 she has now that I can't imagine she would have any issues. The Core i7 is probably not even necessary, as the 3.4GHz i5 (i5-7500) in the 27" iMac is easily twice as fast as the Core i7 (i7-860) in the 2009 iMac. Also, upgrading means she can update to macOS Mojave once her audio software catches up (most vendors are very conservative about supporting the latest release of macOS).

I focus on the i5 or the i7 in the 21.5" iMac, as opposed to the higher-end i5 and i7 in the 27", because the 3.4GHz i5 and 3.6GHz i7 are 65W CPUs and, anecdotally, the fans don't tend to spin up nearly as often as they do on the 3.8GHz i5 or 4.2GHz i7 (95W TDPs) in the higher-end configs of the 27" iMac. Fewer fan spin-ups = less noise for voiceover, depending on her studio setup. I'm not sure why you would be worried about performance if she is currently able to get the job done with a 2009 iMac - unless the app itself has become a big, bloated mess that eats CPUs and RAM for breakfast. Most voiceover work doesn't demand a ton of plug-ins or tracks the way typical music production does.

Also, flash storage would be faster, and size shouldn't be an issue, depending on whether or not she is, or can start, using external storage to keep the audio files and libraries off the macOS system drive. Logic Pro X is said to benefit from this, but I don't know about other applications. I would recommend against the Fusion Drive at this point, but if you get one, make sure it's at least the 2TB version, as the 1TB version gives you a paltry 24GB of flash storage versus 128GB for the 2TB and 3TB versions.

USB 3.0 storage is fairly plentiful and cheap and should be fast enough, especially some of the USB-C flash drives like the SanDisk 900 or the Samsung T-series SSDs. Best Buy is forever running sales on these drives. Wait until Black Friday, if possible.

If you can do it, I would recommend waiting. If not, you do have solid options. No, they are not the latest and greatest, but they will get the job done for another 9-10 years. Good luck!
 
I actually liked Windows 8.1 enough to prefer having it on my work computer instead of Windows 7. Most of the Metro changes were horrendous, but other changes in the system (native .iso support, shortcuts for launching both elevated and non-elevated PowerShell/cmd in Explorer, better multi-monitor support etc) made my day-to-day work much nicer.

Windows 10 is for my use the best version so far, so it’s a pity Microsoft has tarnished it with the early privacy issues and ads in the lock screen later on.

That's the odd thing about Windows 8(.1): it was, by far, one of Microsoft's BEST-functioning OSes,

but the UI changes were absolutely horrendous for users. The core, though - the underlying performance, functionality and stability - contained some of the best changes in Windows I have ever used (and I've used every single released version, both corporate and personal). Windows 8 was rock solid, absolutely able to take a beating with whatever was running on it and keep going.

But the UI just turned everyone off so much.

The absolute worst Windows ever written? That's a toss-up between ME and Vista. Those cores and kernels were so horrendously unstable, with overbearing performance requirements - just downright blasphemies against technology.

Yet more people hate Win 8 because of the UI.
 

Well said! For me, ME was the worst.

Also, regarding 8, after all these years MS should have known that you can't force people into changing the UI so abruptly from one version to the next - it's something that takes time for people to adapt to, usually 2-3 versions of slow transition...
 
Wow! ME is from 18 years ago (the year 2000). I don't even remember what kind of Mac OS we had back then.
Anyways, people don't seem to complain about the Windows 10 UI so much anymore ;)
 
What's your impression of Windows 10 with the Linux subsystem?

From my side, as someone who's also admin'd Linux in various forms since the '90s, I love that it's there, but I still feel that it's a half-assed job.

It's a sandboxed "container", which isn't bad for many uses, but I'd love it if it had far, far greater integration with the core of the OS so that I could essentially use Linux-style bash scripting for deep control over Windows.
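To be fair, there is a bit of plumbing between the two sides already: the Windows drives show up under /mnt/c, and Windows executables can be launched from the Linux side. A minimal sketch of that interop - assuming a stock Windows 10 WSL install with interop left at its defaults, and using Python rather than bash purely to keep the example self-contained (the file name is hypothetical):

# wsl_interop.py - run with the distro's python3 from inside a WSL shell.
# Assumes Windows 10 with WSL installed, the default /mnt/c drive mount,
# and Windows-executable interop left enabled (the default).
import subprocess
from pathlib import Path

# 1) The Windows filesystem is visible to Linux tools under /mnt/c,
#    so ordinary scripts can read and write Windows-side files.
profiles = [p.name for p in Path("/mnt/c/Users").iterdir() if p.is_dir()]
print("Windows user profiles:", profiles)

# 2) Windows executables can be called from inside WSL; here cmd.exe
#    (found via the Windows PATH that WSL appends) reports the OS version.
result = subprocess.run(
    ["cmd.exe", "/c", "ver"],
    cwd="/mnt/c",                 # give cmd.exe a real Windows working directory
    stdout=subprocess.PIPE,
    universal_newlines=True,
)
print("Windows says:", result.stdout.strip())

It only goes so far, though: that's still launching Windows binaries from a sandbox, not the registry- and service-level control from bash that I'd really want.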
Wow! ME is from 18 years ago (the year 2000). I don't even remember what kind of Mac OS we had back then.
Anyways, people don't seem to complain about the Windows 10 UI so much anymore ;)


They did an OK job of scaling back the force of the changes. The most disturbing thing was that the "desktop" felt secondary to the "Metro" UI, which, from the perspective of standard users who had used Windows since the shift to the current paradigm in 95, was a dramatic change.

Windows was Windows because of the Start MENU overlaid on top of a WIMP-styled desktop platform. Windows 8 was too dramatic a departure, expecting that WIMP was gone and that full screen was the future.

Windows 10 gets fewer complaints because it went back to the original paradigm of a Start menu on top of a desktop with WIMP-based navigation, while providing the "full screen UI" as an option for those who wanted to move that way.

Windows 10 still has a bit of an issue with Settings vs. Control Panel, but that's going to be a long, painful road to go down. The Control Panel itself is an immensely powerful set of tools that, as an admin, I am loath to give up. Settings might be great for your average user, but it does not have everything that Control Panel can do.
 

I agree. With every Win10 update MS is putting more stuff in the 'modern' Settings panel, but is still reluctant to remove the 'old' Control Panel. We'll see where that ends ;)
 

Yes yes! Excellent analysis, spot on!
 
The absolute worst Windows ever written? That's a toss-up between ME and Vista. Those cores and kernels were so horrendously unstable, with overbearing performance requirements - just downright blasphemies against technology.

Yet more people hate Win 8 because of the UI.

I'd say ME, as it was based on the highly unstable 9x core. Vista had some performance issues, but it brought along much-needed changes to the security model, and it was also the unfortunate messenger that got shot while telling developers, "guys, you really can't assume everyone's an admin anymore. This time it's for real and heavily enforced." Also, the overzealous UAC (which gained more sensible settings in 7) didn't actually help, but I'd say Vista was still better than its reputation.
 
Well said! For me, ME was the worst.

Also, regarding 8, after all these years MS should have known that you can't force people into changing the UI so abruptly from one version to the next - it's something that takes time for people to adapt to, usually 2-3 versions of slow transition...

How did you even get to use Windows ME?

I guess you must've seen it work in the seconds that it stayed up between plug n play driver crashes!
 
We bought a "new" iMac yesterday, as the 2011 one finally gave up the ghost. The 2011 had a Thunderbolt 2 SSD that ran as system drive, and the computer was nice and fast(ish). Obviously, I thought, six years later the Macs must have become SO MUCH FASTER!

Yes. You guessed it. 1 TB Fusion Drive. (The 512 GB SSD was 450 euros more!!!) I can't connect the Thunderbolt 2 SSD to USB-C, and moving the same drive to a USB-A enclosure gives me 40 MB/s read/write speeds over USB 3.0. But once everything was said, done, and consolidated from two drives to one, there was only one thing that took me a long time to set up:

WINDOWS. (in VirtualBox)

After a while the iMac figured out more or less what to put on its five-kB SSD, and it now only takes twice as long to start up as the 2011 one did. Everything works OK; programs launch at a very acceptable pace. Except Windows and everything in it, because the VM file is 64 GB and obviously can't fit on the 24 GB flash capacity of the Fusion Drive. But the bigger problem was Microsoft's shamelessness: 'This machine's processor is 7th generation. That means it's too new. Get Windows 10.' We were forced to install Win 10 because the machine is too new. Even when I bought new MacBooks that were not supposed to run the old system, a Time Machine restore gave me a nice and shiny install of El Capitan. The Win 8.1 VM is, well, a VM - the drive image is the same as it was - but the processor is too new. I can't with that.

I'm awaiting #dongles so I can connect the SSD again, but I'm also marvelling over how shameless Microsoft is. The computer is too new to receive security updates? Win 8.1 is still being updated and worked perfectly well; it's not like it's some ancient system that was originally compiled for the Commodore 64. And this sort of thing is example #5958 of why, despite the 'sad state of Macintosh hardware', I want to stick with a Mac, despite the insane storage prices.

How did you even get to use Windows ME?

I guess you must've seen it work in the seconds that it stayed up between plug n play driver crashes!
ME was the worst operating system ever. I had that on my work computer for about three weeks before demanding Windows 2000. I didn't know it was possible to write something that crashes so often!
 