And go where? To Windows, where the manufacturers sail on the brink of bankruptcy!
That's the big problem: the alternatives still aren't all that great. And yet, they've gotten much better in the last few years. Just goes to show how huge the gap once was.
Here's the point... if current trends continue, we'll eventually have to switch, or deal with Apple's new vision (i.e.: get used to iPad Pros or really anemic 'Macs' if we're lucky) and put up with much of the frustration we came to the Mac to escape.
If Apple can build a quad-core A10 Fusion chip for a phone, it can build an ARM chip suitable for Macs.
Yes, this is potentially doable. And, I suppose they could put tons of cores in them. That would be a big problem for applications that depend on single-thread performance, though. But, I suppose eventually, they'll get there (if that's the goal).
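To put rough numbers on the single-thread worry: Amdahl's law caps the speedup you get from extra cores by whatever fraction of the work stays serial. A back-of-the-envelope sketch (the 50% and 95% parallel fractions below are hypothetical examples, not measurements of any real app):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel. Hypothetical fractions, just to show why
# piling on cores doesn't rescue serial-heavy software.

def amdahl_speedup(p, n):
    """Theoretical speedup with n cores and parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16):
    # An app that's 50% serial tops out around 2x no matter the cores...
    print(f"50% parallel, {cores:2d} cores: {amdahl_speedup(0.5, cores):.2f}x")
    # ...while a render-style 95% parallel job keeps scaling.
    print(f"95% parallel, {cores:2d} cores: {amdahl_speedup(0.95, cores):.2f}x")
```

An app that's half serial tops out around 2x no matter how many cores you throw at it, which is exactly why 'tons of cores' doesn't help the single-thread-hungry stuff.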
I want two things. 1) a renewed commitment to "it just works"... I value seamless, reliable cloud-based device integration, and it's not perfected yet. 2) extreme longevity regarding OS and application compatibility with my ageing hardware.
That's the problem; Apple has moved away on both counts there. The quality, especially software, has dropped. While it might never have 'just worked,' it now 'just works' a lot less. And Macs don't last nearly as long as they used to, due to forced obsolescence (from both hardware and software).
Apple is going to dump Intel for ARM.
Possibly; I'm just not sure I see why. I guess it gives them a bit more control over the features and roadmap, etc. But, it seems Intel or some other company would be making the chips anyway, so it isn't like it's totally in-house. Intel seems to be doing what they can. I suppose it could be argued ARM is a better architecture in some ways... but it reintroduces one of the big problems the transition to Intel solved (unless direct compatibility with the rest of the PC world is no longer a concern).
My MacBook Air 2011 has recently started to struggle with simple tasks, like Finder operations. It is also running hotter, which means more fan noise.
Second machine is a Mac mini server 2011. I have upgraded it to 8GB of RAM and swapped the drives for SSDs. ... But even if a 2014 Mac mini will outperform the 2011 server ...
That seems a bit odd for the MBA, but maybe you're doing something in particular, or something is going wrong with it. I'd avoid Chrome (and Flash) and Adobe products, as those are what seem to drive my wife's MBA most nuts.
Is that mini server a quad-core? If so, the current ones will be slower. Hang on to it.

I was actually considering a mini for a home media (and extra processing) server, but the new ones are kind of lame. I'd actually buy a used quad-core at this point, but I'm nervous about how long it will be until Apple cuts support.
Personally I am not interested in the kind of ARM laptop you're describing. I might as well buy a tablet to do all that. ...
In my opinion, as I explained before, ARM will never overtake x86 in any real desktop and laptop environment in the foreseeable future. It's not only the straight speed, where currently there's a world of difference. It's peripherals and buses, which are essential for storage and networking, where the difference is even bigger.
Note, there's a big difference in terms of UX/UI between a laptop and a tablet that would have a huge impact on workflow. It's not as easy as Cook makes it sound, unless what you do is limited to browsing, email, and a few other simple tasks (which is what many do with their computers).
Good points, though, on x86 vs ARM. That's not a simple move.
Just curious - those saying that they're tired of waiting and are jumping ship, what are you moving to? Windows?
I guess, yes, that's where we'd have to move at the moment. Not a pretty picture either, but at some point, it might be worth it. I'm not there yet, but need to be considering these things down the road, while hopefully waking Apple up so they can move the rudder and avoid the iceberg.
Not saying that it's not completely valid, but Apple is likely to shoot for the sweet spot where the vast majority of users live.
Exactly, which is part of the problem. The old Apple recognized the importance of some of the niches, and the impact they have on the whole ecosystem. But besides that, they are also slipping in UX, quality control, and making some other rather daft decisions. And those will eventually impact even the sweet-spot customers.
you're only saying what you believe to be true instead of discussing what you know to be true.. such as your own exact experience when using and/or working with a computer.
Ok, I'm not sure where you're trying to go with this. I've worked in IS/IT for over 25 years now, part of that in a Fortune 100. I spent a good bit of that time working with an industrial design firm, where I did all their CAD and 3D rendering. So, I know a thing or two about it from both perspectives.
pro apps vs non pro apps has nothing to do with it.. the process determines if it's suitable for multiple core usages.. not the pro-ness of the application.
Yes, but it's usually the 'pro' apps and the 'pro' users who care most about this and spend the money on the high-end equipment.
problem is, you can count the number of processes which can do this in a noticeable manner on your fingers... very few processes can break down into individual little equations that calculate independently of each other then rejoin at the end..
Yes, but that's the stuff a lot of people who want powerful computers do... whether that's 3D folks or video editors, or home users encoding their Blu-ray library. Even many games now take advantage of multiple cores (I just watched a YouTube video the other day where a guy was seeing if he could make a high-end single-core CPU work well with modern games, and it didn't do all that well).
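For what it's worth, the 'break into independent pieces, compute, rejoin' pattern being described looks like this in practice. A minimal Python sketch; render_frame here is a hypothetical stand-in for real per-frame work, not any actual renderer:

```python
# Minimal sketch of an embarrassingly parallel job: split the work into
# independent chunks, compute them on separate cores, rejoin at the end.
from multiprocessing import Pool
import os

def render_frame(frame_number):
    # Stand-in for CPU-heavy, independent work (one frame of an animation).
    total = 0
    for i in range(1_000_000):
        total += (i * frame_number) % 97
    return frame_number, total

if __name__ == "__main__":
    frames = range(240)  # e.g. ten seconds of animation at 24 fps
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(render_frame, frames)  # fan out, then rejoin
    print(f"rendered {len(results)} frames on {os.cpu_count()} cores")
```

The key property is that no frame needs another frame's result, so the chunks really do run independently and only 'rejoin' when map collects them. Rendering, encoding, and folding simulations all have that shape.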
'stuff'? what do you mean, 'stuff'...
Well, for example, I'm a contributor to Folding@home. Unless I'm doing something really mission critical, I usually have it running in the background at about 50%. I don't even notice it's there, yet I'm contributing to the research. Or, if I'm encoding a Blu-ray, with more cores, that wouldn't impact my other work, yet it would still get done quickly. Or, I could run a render job and assign it to several of the cores while the rest stay available for modeling in the CAD app, or I could be working on setting up the next scene while I wait for the render results.
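That 'run it at about 50%' arrangement is easy to approximate for any batch job of your own. A sketch, assuming a Unix-ish system like macOS (os.nice isn't available on Windows), with a hypothetical crunch function standing in for the real work:

```python
# Sketch: keep a background job to roughly half the machine.
# Use half the cores and lower the workers' scheduling priority so
# foreground work (CAD, browsing, whatever) stays responsive.
from multiprocessing import Pool
import os

def lower_priority():
    os.nice(10)  # politely deprioritize each worker (Unix/macOS only)

def crunch(chunk):
    # Stand-in for real background work (folding, encoding, rendering).
    return sum(i * i for i in range(chunk))

if __name__ == "__main__":
    workers = max(1, os.cpu_count() // 2)  # leave half the cores free
    with Pool(processes=workers, initializer=lower_priority) as pool:
        results = pool.map(crunch, [2_000_000] * 8)
    print(f"finished {len(results)} chunks on {workers} background workers")
```

Half the cores plus a lower scheduling priority is roughly how that kind of background contribution stays invisible while you work.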
i don't know how to get you to stop and say to yourself "ok, i don't actually know what i'm typing to be true.. i'm not a professional doing 3D rendering so maybe i should quit using it for my examples"
Ok, you caught me. I haven't done any rendering for a couple of years now as I've been focused on other things and my animation software is incompatible with El Cap. That's pretty easily resolvable should a job come along, or my focus change.
But, I do follow the industry, and do some CAD from time to time. I'm pretty sure the 3D industry hasn't changed that much. If your software can't use multiple cores and RAM, I can recommend some other apps for you.
what, exactly, are 'some of the apps'... when you feed them more cores and ram, what are you physically doing and what state are the computers in? are you personally working faster or finishing your day quicker with more cores in a computer?
Absolutely. I've been using Renderama on multiple computers since before there were even multi-core machines. It will use AS MANY cores and nearly as much RAM (and as many computers) as I can feed it, and yes, IT WILL get the job done more quickly if I do. Fortunately, there are now cloud services, like Amazon's, where I could rent whatever computing capacity I'd like, if I wanted to.
And, absolutely, I can continue working on other projects/software while using local cores and RAM for rendering. Again, if you can't, I'd recommend checking some other software out.
let's say i'm a pro and do 3D rendering.. and let's say you're not a pro and don't do 3D rendering.
You can say whatever makes you feel better.
i am a professional computer user and i do 3D modeling and renderings.. for money. for my livelihood.. for my passion.
Well, then you realize the importance of cores and RAM, or you need some IT/software advice.
Well, my iBook G4 just died yesterday so I am in need of a new computer. As "outdated" as the Mac mini is, it is still faster than my 933MHz PowerPC G4...
Oh, it's not so much that the mini is a bad machine... it's more that it went backwards from where it was a few years ago. I'd recommend an iMac instead, though, in terms of price/performance, if you're going desktop. The mini is insanely expensive for what it is now.
Also, I might be a bit paranoid, but I really do not want my computer "listening" to me. While I am thinking about it, how hard is it to keep the computer from automatically sending things to iCloud, or any cloud for that matter? I do not want the computer sending anything, to anywhere, at any time unless I specifically command it to. The same goes for things being received on my computer.
Yea, not really, but welcome to modern computing.

Some of the security holes plugged recently would allow those with knowledge of them to watch or listen to you, for example. You'd have to physically disable the mic and cameras to be truly safe.
I guess if you do want the functionality of Siri, I'd trust Apple as far as any other such service. But, yea, I just don't need such a feature. As for the cloud, it's hard to avoid if you want to use the computer with other devices. The whole OS is set up to talk to Apple these days. There's constant 'phone home' type stuff going on. You'd almost have to just disconnect from the Internet if you don't want that. (i.e.: It is NOT at all like it was 5-10 years ago, when you knew when you were dealing with online vs. local.)