For f***'s sake, x86 is a 40-year-old crap architecture that should have died decades ago.

x64 isn't 40 years old, and ARM isn't revolutionary. All current CPUs spend 90% of their time doing nothing but burning electricity. The actual time it takes to add or compare two numbers is a tiny fraction of the time spent fetching instructions, transferring data, etc. A revolution is necessary: a totally new, non-von Neumann architecture. For example, an FPGA that can be reprogrammed a million times a second, to execute loops natively in hardware, without fetching each instruction one at a time. Software would only be used to reprogram the hardware. The difference would be like that between interpreted and compiled languages, because a CPU is nothing more than a poor interpreter these days.
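To make the "CPU as a poor interpreter" analogy concrete, here is a toy sketch in TypeScript (purely illustrative; the instruction set and register layout are invented for the example). Every von Neumann machine runs a loop like this, and the fetch/dispatch overhead wrapped around each tiny add or compare is exactly the waste being complained about:

// A toy von Neumann-style interpreter: most of the work per step
// is fetch and dispatch, not the actual add or compare.
type Instr =
  | { op: "add"; dst: number; a: number; b: number }
  | { op: "jmpIfLess"; a: number; b: number; target: number }
  | { op: "halt" };

function run(program: Instr[], regs: number[]): void {
  let pc = 0;
  while (true) {
    const instr = program[pc]; // fetch
    switch (instr.op) {        // decode
      case "add":              // execute: the cheap part
        regs[instr.dst] = regs[instr.a] + regs[instr.b];
        pc++;
        break;
      case "jmpIfLess":
        pc = regs[instr.a] < regs[instr.b] ? instr.target : pc + 1;
        break;
      case "halt":
        return;
    }
  }
}

// Sum 0..999: regs = [accumulator, i, constant 1, limit]
const regs = [0, 0, 1, 1000];
run(
  [
    { op: "add", dst: 0, a: 0, b: 1 },          // acc += i
    { op: "add", dst: 1, a: 1, b: 2 },          // i += 1
    { op: "jmpIfLess", a: 1, b: 3, target: 0 }, // while i < 1000
    { op: "halt" },
  ],
  regs
);
console.log(regs[0]); // 499500

An FPGA realization of the same loop would wire the adder and the comparison directly together, with no per-iteration fetch or dispatch at all.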

However, any serious alternative is at the research stage at best, and no one can predict how successful it will be. In the long run, there's no question that all of today's CPUs will go away completely.
 
ARM is only targeting a specific subset of servers. They are not targeting servers in general. The targets they are aiming at are "front end" cloud servers (e.g., generic web server front ends) and highly variable load servers. Think hosted virtual machines, where the host company makes money by how many limited-throughput 32-bit VMs they can pack onto some hardware. Or "peaker load" servers that can rapidly spin up to handle load bubbles and then spin back down to extremely low power usage when the load evaporates.

Intel is now targeting the same market with their Atom server offerings. All this has very little to do with places where you need full-time computational horsepower. In short, in parts of server centers power efficiency was becoming a bigger selling point, and ARM saw (and still sees) an opportunity.

E.g., HP Moonshot.
 
Your concerns are valid, but misplaced.

When I wrote about potential issues with single-threaded performance, I was first and foremost thinking about web applications. JavaScript is essentially single threaded (yes, you can parallelise some stuff with WebWorkers), but most of the code, especially the code that handles DOM changes, event responses, animation, etc., is still going to be single threaded.
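To illustrate the constraint with a minimal sketch (TypeScript; the worker file name and message shape are invented for the example): heavy computation can be shipped off to a WebWorker, but only the main thread may touch the DOM, so all the UI work still lands on one core:

// main.ts: runs on the single UI thread
const worker = new Worker("worker.js");
worker.postMessage({ numbers: [1, 2, 3, 4] }); // offload the heavy part
worker.onmessage = (e: MessageEvent<number>) => {
  // Only this thread is allowed to update the DOM, so every UI
  // change still queues up here, single threaded.
  document.querySelector("#result")!.textContent = String(e.data);
};

// worker.js: runs in the worker's global scope, with no DOM access
self.onmessage = (e: MessageEvent<{ numbers: number[] }>) => {
  const sum = e.data.numbers.reduce((a, b) => a + b, 0);
  self.postMessage(sum); // result goes back to the UI thread
};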
True, but there are other ways to address browser performance issues. Install a WebKit nightly on an old computer, for example. They now compile JavaScript with an LLVM backend, and they are also working on compiling CSS. On my old 2008 MBP the snappiness was very noticeable. Frankly, the A7 in its current form isn't far from the performance of this machine. By the time this hardware ships, I'm not sure many would notice a JavaScript performance issue when upgrading from older hardware.
With modern web applications, which heavily rely on advanced DOM manipulation and model binding (such as things built with AngularJS or EmberJS), single-threaded performance is the deciding factor in how smooth the website feels. Of course, it might very well be that an A7/A8 can execute all the JavaScript fast enough that no latency is perceivable.
We already know that the A7 is the market leader in JavaScript performance in its current market segment. So one can only expect that things will get better.
Until this is clear, the single threaded performance remains a concern. As of now, the current MBA is more than twice as fast in browser benchmarks.

Which is my whole point when it comes to running the A7 at similar clock speeds. The current Air can run single-threaded apps at pretty high clock speeds until it throttles. The A7 runs at around 1.3 GHz, so there is plenty of room to improve things.

The point here is that you can't knock the performance of ARM in a laptop based on the performance you see in an iPhone. I don't see Apple using the cell phone variant in a laptop, but rather a slightly tweaked version. Given this, you really can't knock it until it ships.

----------

No, it wouldn't. The proportion of the total cost of an MBA which goes towards the CPU is small to the point of being irrelevant.

$300
 
It's not flawed. You have to include all the cores. That makes up the whole processor. If Apple can't design a multi-core processor then it'll never catch up to the performance of Intel.

Since you want to compare cores, here's the Core 2 Duo against the A7. Still lags behind.

http://cpuboss.com/cpus/Intel-Core2-Duo-T7400-vs-Apple-A7

Apple's A7 and the ARM processor are only good for light tasks. They are good processors for sure, but they should never be in anything that requires performance. I can see them maybe going into the MacBook Airs, but not the MacBook Pros, where people require performance.

We'll go by your Geekbench website. Single-core:
Apple A7: 1418
Core 2 Duo: 1341
Core i7: 3106

Unfortunately that isn't taking the entire processor into account. Which you have to do to be fair.

Multi-core:
Core i7: 12949
A7: 2548

Even if Apple had a quad core, these benchmarks tell me that Apple's quad-core would still only be at around 5000, still lagging behind Intel.

Multi-core is where it counts, because you typically multitask and you do use all the cores. Not so much on the iPhone or tablet, because you can only do one thing at a time, unlike a computer where you have browsers open, iPhoto open, backups running, music playing, video editing, a Word doc being written, VMware Fusion running, etc...

The A7 will never replace Intel in Apple's lineup. It might be a new product line, but it'll never replace it.

Your argument is flawed on many levels. First of all, the benchmark scores are wrong. Looking at the Geekbench 3 results, an iPad Air with an A7 reaches 1478 points while a 2635QM reaches 2597 in single-core performance. So the difference is 'just' 2x per core. And Apple has doubled the speed of their ARM CPU (at least going by Geekbench) within the last two years. Intel only managed a 10% increase. Continuing the trend, ARM will outperform Intel within a few years ;) Still, another problem with your argument is that you can't extrapolate the current trends like this. These things are not linear.
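To spell out the (deliberately naive) extrapolation with a quick sketch (TypeScript; the growth rates are the ones quoted above, a doubling every two years for Apple versus ~10% for Intel, and the starting scores are the Geekbench 3 figures):

// Straight-line trend extrapolation from the figures above.
let arm = 1478;   // iPad Air (A7), Geekbench 3 single-core
let intel = 2597; // i7-2635QM, Geekbench 3 single-core

for (let years = 2; years <= 6; years += 2) {
  arm *= 2.0;     // "doubled ... within the last two years"
  intel *= 1.1;   // "only managed a 10% increase"
  console.log(`+${years}y: ARM ${Math.round(arm)} vs Intel ${Math.round(intel)}`);
}
// +2y: ARM 2956 vs Intel 2857, so the naive trend lines cross almost
// immediately, which is exactly why this kind of extrapolation is suspect.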

----------



It is a different tool with a different purpose in mind. Still, what relevance does the discussion of forking have to what I wrote about web application performance?
 
Sure, but people on here are getting all heated up that they will REPLACE Intel with ARM. It's impossible. The chipset is incapable of the sort of horsepower that Intel can deliver.
Which is baloney! There is nothing about the architecture that keeps it from competing with Intel. In honest comparisons, the A7 holds its own against far more robust systems. Given that it does well now, there is nothing to keep it from performing much better when designed into a similarly robust system.
ARM is a reduced instruction set, which is partly why it's more efficient and doesn't have many of the more advanced features of x64 processors. There is almost no way to get current high-end apps running on an ARM as is.
BS. You can run any app on ARM that you can run on x86. High-end apps just require a similar implementation, in other words more RAM and faster ports. There is nothing about ARM that keeps these ports from happening.
It's the same reason that you can't just take the way more powerful GPUs from AMD and Nvidia and have them run OS X or any other OS. They are not designed the same way. And don't forget, it's not actually Apple that defines the A7, it's the ARM company; Apple adapts the design.
Actually, there is considerable speculation in the industry that Apple had a lot to do with ARM's 64-bit architecture. They have more than a year's head start on working 64-bit hardware, for example. As for the GPU, a part whose importance is often underestimated, Apple has hired a very large number of GPU engineers lately. I wouldn't be surprised to find out that Apple has big plans here.
This is just another balderdash rumour.

Maybe! However, I'd love to see an ARM-based platform running an open version of Mac OS. I'd jump on the platform immediately, especially if there is a Mini replacement in the works. It has the future written all over it.
 
Let's see ARM pass up Intel CPUs by leaps and bounds and then we'll talk about expanding it to Macs. It's only making progress faster right now because it's already way behind the curve to begin with! An iPad can't be compared to a quad-i7 for goodness sake. Blow me away with performance numbers and we'll talk. Otherwise, people are getting excited over NOTHING.

The future of computing? Hell, if the future of computing is a bunch of people texting on their phones, heaven help us. That's what most people spend most of their time doing on their phones: texting, emailing and buying crap. I don't even own an iPhone, since I can buy a new MacBook Pro every other year for what I'd spend on something that would get me in trouble at work (i.e., surfing when I should be working), and frankly, people don't need to know where I'm at and what I'm doing while driving (let alone the growing nightmare of accidents... some even while walking into traffic due to texting, etc.). IT CAN WAIT, PEOPLE. Try a real conversation with someone for once in your life. It really scares me that today's kids aren't going to be able to interact on any kind of normal social basis, and their thumbs are going to need joint replacements by the time they're 20.
 
Derp.

This is like saying the only thing that keeps McDonalds afloat is their fast food.

I think it's more so saying that Intel has, for the past 7 or so years, been a process node ahead of AMD. There have been a lot of times in those 7 years when an Intel processor designed for one form factor would have lost its performance advantage if it had been made on the same process node AMD was on. Due to the process node advantage, Intel could crank clock speeds and core counts a little higher for the TDP envelope and didn't have to advance their architecture as much.
Sandy Bridge gave about a 10% improvement; not bad, but not really significant. Haswell gave around 6%. That's irritating to enthusiasts like me. We want them to put out some architectures that look like they really tried, rather than doing the bare minimum to maintain their market dominance.
By the time Intel gets two process nodes ahead of competing fabs, it would be too expensive on a per-chip basis, so it's not economical for them to keep pushing it. Meanwhile, the inferior fabs can stay one node behind, and because Apple can make architecture improvements by leaps and bounds, they overcome the process node disadvantage and win on performance per watt. It looks like they'll win on single-threaded performance by the A9, even if we assume their instructions-per-clock improvements start leveling off.
 
No, it wouldn't. The proportion of the total cost of a MBA which goes towards the CPU is small to the point of being irrelevant.

What is the cost of the current i5 in the base MacBook Air versus four A7s? I wonder what four A8s would cost. The article mentions that there would be four to eight A-series processors.
 
x64 isn't 40 years old, and ARM isn't revolutionary. ...

It's close to 40 years... the 8086 came out in 1978... and yes the ARM processor isn't exactly young in terms of ancestry either.

Like Windows, I see x86 as the best and worst thing that could have happened to computing: it brought the masses but stifled competition. At one point you had either Intel or an Intel clone, simply because it was the only thing that ran Windows, and anything other than Windows was crap... and IMHO that was a pretty low benchmark.

For me, the success of running an ARM-based machine will depend on what current software I can run on it from day one... it's a pretty straightforward measurement... one which I'll use when my current family household of 4 machines is due a refresh.
 
While what you say is true, your basic assumption is that non-PC devices won't become more powerful and evolve, just as the PC has. There was a time when laptops were viewed as an adjunct to, not a replacement for, much more powerful desktops. Today, laptops are often the only PC a person uses.

Yes. A post-PC device is supposed to be something different. A laptop equipped with an ARM processor is not a post-PC device; it is a PC. The iPad may well evolve and become as powerful as a full-blown laptop, but then will it still be a post-PC device? Won't it turn itself into a PC? Does this post-PC speech make any sense after all?

As for tablets, some of the early ones were very limited. The Newton was neat, but even in its 2K incarnation it was still an adjunct to a PC, not a replacement. Tablets, while still an adjunct in many cases, have evolved to the point where, for many things, they can replace a PC. I can watch Netflix on one, stream it to my TV if I want, log on to mainframes or websites, and do everything I can on my PC.

PowerPoint, OTOH, isn't yet powerful enough to replace a PC running a presentation. Word, as you point out, is best suited for simple tasks or text entry for later use on the PC version.

As tablets evolve you'll see more and more tasks performed primarily on them, just as laptops started replacing desktops.

Tablets are becoming more and more powerful. But they are different. Laptops have always run the same operating system as desktops; they were just less powerful. Tablets run a different operating system. The iPad is just different. It is supposed to do less.

----------

You make a valid pitch for your use case. But the retort would be that not everyone has, or wants, a PhD. The vast majority of Ma and Pa Kettles who buy computers wouldn't even have heard of a citation (other than the kind that makes them go to court).

In fact, even in 2013, only about a third of the US population has even a bachelor's degree - Educational_attainment_in_the_United_States

Yes, and here in Brazil even fewer people have a bachelor's degree.

That's not the point, though. I just gave an example of how a PC, a real one, can be much more useful, powerful, and versatile than any of the so-called post-PC devices which are available. And that the PC is not anachronistic.
 
It's not flawed. You have to include all the cores. That makes up the whole processor. If Apple can't design a multi-core processor then it'll never catch up to the performance of Intel.
The current A7 is designed for extremely low power; it is literally going into cell phones. Apple could easily take the same cores and double or more their numbers for use in a desktop machine.
Since you want to compare cores, here's the Core 2 Duo against the A7. Still lags behind.

http://cpuboss.com/cpus/Intel-Core2-Duo-T7400-vs-Apple-A7
Not exactly a trustworthy site when the page has real technical errors. Even so, it tends to support most people's opinions here. Run the A7 at the same clock rates as current Intel systems, with a corresponding RAM array, and you have a very competitive platform. This is especially true considering that more cores can easily be had in ARM implementations.
Apple's A7 and the ARM processor are only good for light tasks. They are good processors for sure, but they should never be in anything that requires performance. I can see them maybe going into the MacBook Airs, but not the MacBook Pros, where people require performance.
There is no reason for them to go into MBPs. People are missing the whole point here, which is to offer up a new platform which delivers good performance at a dramatically reduced cost.

However, your insinuation that the chip is only good for light tasks is at best dishonest. The right way to look at it is that the A7 gives incredible performance in a very resource-constrained environment. Remove those constraints and you have a different capability.
We'll go by your Geekbench website. Single-core:
Apple A7: 1418
Core 2 Duo: 1341
Core i7: 3106
Which is a terrible benchmark for cross-platform comparisons.
Unfortunately that isn't taking the entire processor into account. Which you have to do to be fair.

Multi-core:
Core i7: 12949
A7: 2548

Even if Apple had a quad core, these benchmarks tell me that Apple's quad-core would still only be at around 5000, still lagging behind Intel.
Actually, they tell you nothing, because you are working with an ARM variant engineered into an entirely different platform. You have no idea what an A7-like processor can do with lots of faster RAM and an improved secondary storage system.
Multi-core is where it counts because you typically multitask and you do use all the cores. Not so much on the iphone or tablet because you can only do one thing at a time unlike a computer where you have browsers open, iphoto open, backups running, music playing, video editing, writing a word doc, running vmware fusion, etc...
So true, so why do you compare an A7 designed for a cell phone against a top-end Intel chip? At the very least Apple would need four cores in the chip, something easy to obtain. Even more important to your benchmarking are RAM and I/O.
The A7 will never replace Intel in Apple's lineup. It might be a new product line, but it'll never replace it.

Well, it certainly makes sense in the short term to do a new product line. However, long term, Intel's days are numbered. Even if Apple stays x86, they may be forced to go AMD, simply because AMD is very willing to do custom work. In the end it is all about what goes on the chip.
 
"Post-PC" is not so much a gimmick, but far more so appealing to notion that most folks don't separate form factors from actual function very well. "Post PC" is far more the notion are going past the notion that a "PC" looks like the classic IBM "box with slots" from the 80's form factor or even the clamshell laptop of the 90's. That it runs DOS/Windows like apps. The personal computer market is going past being stuck with just those two primary form factors and being benchmarked solely on the OS side against Windows. The notion of a personal computer never was suppose to be stuck to a small subset of form factors.

What the term had grown to mean was an x86 + Windows + classic-form-factor machine.

That isn't what the notion is about. The origin of the notion of the Macintosh was as an appliance, more so than a malleable box with slots. It has little to nothing to do with anachronism and more to do with calcification of terminology.

Those feature limitations are far more artifacts of those specific programs than of the form factor and/or iOS.

Again, not particularly an issue of the new form factors and/or platforms.

Not necessarily. Mac OS made two previous transitions and apps worked.
That is more a question of whether Apple builds a scaffolding emulator or not. Personally, I don't think they want to put effort into that (they didn't on the last transition... they bought access to a solution).

It is impossible to write a 250 page document on an iPad plus keyboard?
Footnotes and references are just a matter of software and ease of entry.

This is kind of a chuckle, since when Word first came out folks would say similar things about using Word versus the far more capable (especially for dissertations) TeX system, and other word processing programs they might be familiar with, e.g., Wang's text editor, etc. [Even early-90's FrameMaker versus the then-current versions of Word drew very similar complaints about cross-linking, indexing, citations, etc., where Word "couldn't cut the mustard anywhere near as well".]


Word has, what, around 30 years of development time behind it on the Mac, and Word for iPad came out less than one year ago. Yeah, sure, they are going to be feature-equal because the man-years allocated to both are about the same... Errrrr, not.

One of the primary reasons iOS apps with the same name have fewer features is that they are younger than their siblings. It isn't an issue of cannot/impossible, and far more one of not yet implemented. "I am not able to do this with this specific version" is dramatically different from "cannot be done".

If actually trying to doom or implode Word for iPad, one highly successful tactic would be to declare that Word won't ship until it is 100% feature-equal to Word for Windows. Saddle the project with unnecessary complexity and watch it implode under the issues that arise with the finite resources available.

"But Word for iPad can't do it right now"..... Guess what? These ARM based laptops and desktops aren't shipping right now either. Software that hasn't shipped yet really isn't a huge issue for hardware that hasn't shipped yet either. When the first iPad shipped some folks commented about how couldn't do Photoshop and Lightroom on them. Several years later that is not entirely true. It isn't about "cannot" (not capable).










Two programs can't possibly work together on iOS. Not really.


The computational requirements of generating a 250-page document with references are, relative to modern CPUs/GPUs, low, and that is one of the primary reasons the classic PC market is stagnating. An A7 (or future A8) SoC has all the horsepower needed to accomplish the task. You can trot out some exhaustive software feature-set laundry list and start quibbling over some relatively small subset that is missing, but "not capable of" is laughably unmotivated.
The problem the industry is trying to come to grips with is that the hardware is far more capable than these limited mainstream workloads.

It can't do localized real-time text-to-speech or some other high-computational-horsepower app, perhaps. However, do what software was doing in the early 90's? That is just an implementation priority issue, not a capability issue.

Well, all you said is right, but only from a theoretical perspective.

I cannot write a 250-page document on my iPad. No iPad software supports cross-references or other advanced features. Yes, Word and even Pages may improve a million times and become even more powerful than their counterparts available for PC/Mac. But they may never evolve as well. It is a matter of software, but the software has to be written, and it takes several years of development for a piece of software to achieve such levels. There is no reference manager for the iPad either, nor integration with Word. iOS can evolve and become capable of it. Of course it is a possibility. Theoretically, everything is possible.

The thing is, I cannot wait 5 or 10 years for the iPad to possibly become powerful enough to do the tasks I am required to do. I need the features now. And the iPad does not offer them now. And I don't know if it will ever offer them. We can only guess based on past experiences. But the world is a very different place right now than it was in the 1990s. Anything can happen.
 
It's close to 40 years... the 8086 came out in 1978... and yes the ARM processor isn't exactly young in terms of ancestry either.

Like Windows, I see x86 as the best and worst thing that could have happened to computing: it brought the masses but stifled competition. At one point you had either Intel or an Intel clone, simply because it was the only thing that ran Windows, and anything other than Windows was crap... and IMHO that was a pretty low benchmark.

For me, the success of running an ARM-based machine will depend on what current software I can run on it from day one... it's a pretty straightforward measurement... one which I'll use when my current family household of 4 machines is due a refresh.

You are confusing x86 and x64.
 
Wait a minute - if they did the transition to ARM, wouldn't they lose Thunderbolt? And Apple can't make a new connector in the next couple of years.
 
Actually, they tell you nothing, because you are working with an ARM variant engineered into an entirely different platform. You have no idea what an A7-like processor can do with lots of faster RAM and an improved secondary storage system.

So true, so why do you compare an A7 designed for a cell phone against a top-end Intel chip? At the very least Apple would need four cores in the chip, something easy to obtain. Even more important to your benchmarking are RAM and I/O.

On top of that, a rumor was out that Apple cranked the clock speed up to 2.6 GHz. They probably wouldn't do that for the iPhone and iPad chips because it's less efficient, but they could do it for MacBooks and iMacs because there's plenty of battery capacity to do it with. So that puts single-core Geekbench results at about 2600. That's competitive enough, and it's not taking into consideration that Apple will be making improvements to the architecture.
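As a rough sanity check on that figure, assuming (naively) that the single-core score scales linearly with clock; real scaling is usually somewhat worse, since memory doesn't speed up with the core (TypeScript, using the A7 numbers quoted earlier in the thread):

// Naive clock-scaling estimate for the rumored 2.6 GHz part.
const a7Score = 1418;        // A7 single-core Geekbench figure quoted above
const a7ClockGHz = 1.3;      // shipping A7 clock
const rumoredClockGHz = 2.6; // rumored laptop/desktop clock

const estimate = a7Score * (rumoredClockGHz / a7ClockGHz);
console.log(Math.round(estimate)); // 2836, in the ballpark of the ~2600 claim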
 
Of course Apple is testing OS X on their A-series processors. I'm sure they are actively testing iOS on Intel, too. I also imagine they have some AMD devices floating around their labs.

In regard to Apple dumping Intel for their laptops and desktops, I don't think that really jibes with Intel's CEO's comments just a few months ago:

http://appleinsider.com/articles/14/02/19/intel-ceo-says-relationship-with-apple-remains-positive-companies-are-growing-closer

Maybe it is just the CEO's wishful thinking, but I would think his comments have more credibility than a "reliable source" at some French Apple news/rumors site that doesn't have much of a track record.
 
An interesting thing to note is that each of these 4-8 chips could get 8 GB/s of RAM bandwidth (based on the A7 design). If Apple designs it that way, you could have 32-64 GB/s of aggregate bandwidth. That's insane. The new Mac Pro has 60 GB/s.
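The arithmetic behind those numbers, spelled out (TypeScript; the 8 GB/s-per-chip figure is the post's own estimate based on the A7 design):

// Aggregate memory bandwidth if each SoC brings its own RAM channel.
const perChipGBps = 8; // per-chip estimate based on the A7
for (const chips of [4, 8]) {
  console.log(`${chips} chips: ${chips * perChipGBps} GB/s aggregate`);
}
// 4 chips: 32 GB/s; 8 chips: 64 GB/s; the new Mac Pro is ~60 GB/s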
 
Not likely. OS X is a more advanced OS. It would be the other way around: OS X integrates iOS, and you load OS X on your iPhone and switch between modes.

With every new version of OS X, it looks more and more like iOS. I can see OS X for servers staying around, but looking at the apps for iOS devices from Apple and others, I think Apple will push iOS onto laptops and desktops in the near future (around 5 years). A lot depends on how far Apple pushes their processors. Or switches processors again.
 
You and me both; the CPU switch was a HUGE headache for me, since I had a not-insignificant amount invested in PPC workstations. Hell, the Quad with the Quadro is still in service. :)

Yeah... if Apple had somehow stayed with PPC, we would have an awesome Mac Pro. The latest 12-core POWER8 CPU is ~2x faster than the Xeon in the current top-end 12-core Mac Pro and is better at performance per watt as well. POWER8 is 190W TDP.
 
The problem with comments like this is that they look towards the past, not the future.

The same API didn't particularly help Windows NT on Alpha and MIPS. The core impediment isn't just the API (or a simple recompile). The issue is that a large fraction of users aren't going to want to acquire new apps. Using the same stuff they already have matters more to most users than acquiring a new set of stuff in a chicken-and-egg situation. The lack of an emulator to scaffold users is a goof that Microsoft seems not to grok at all (or doesn't particularly care about, as long as it can nudge Intel into following Microsoft's demands).
In some instances that is true. However, Apple has an ace up its sleeve, and that is the existing iPad app collection. If they can run these apps in a window and maintain Mac OS's openness, they will be successful. In fact, I could see having such a repository actually generating more Mac OS development.
RT had multiple objectives (which didn't help), and 3rd-party Win32 apps might have helped a bit, but they were not "the problem".
The problem was and is that the OS sucked; it doesn't matter which hardware it runs on, it sucked either way.
That is a huge one. When you already have/own a major OS on an architecture, starting another almost always runs into resistance from the established and existing user base.
It didn't hurt the iPad. In fact, I see significant hunger out there for alternatives, as the bulk of the x86 world has rubbed people the wrong way.
Not necessarily. Different tools for different uses. I don't think there is a deep need to completely merge iOS with OS X into a unified OS, any more than there is a deep need to merge the hardware. iOS and OS X share enough to get some cost efficiencies. iOS and OS X devices could over time share other components (storage flash, WiFi, etc.) without necessarily having to merge the CPU/GPU into the shared pool as well.
I don't understand the merge mantra either. I also don't believe people understand just how much code is already shared between iOS and Mac OS. As for sharing hardware, that is completely possible; at the driver level there are not that many differences to consider.
Frankly, while the ARMv8 architecture (which Apple's A7 implements) is respectable in terms of the lower half of mainstream app CPU needs, the GPUs bundled with most ARM SoCs (for Apple and much of this ARM SoC discussion these are not just ARM chips; they generally all have bundled GPUs) are not really competitive with desktop (and top-end discrete mobile) GPUs in terms of computational throughput.
This argument is nonsense. Of course they aren't competitive with desktop GPUs; they don't burn 75 watts either. I'm not sure why you even brought this up in the context of low-end hardware.
In terms of FLOPS, the ARM SoC GPUs are way off what discrete GPUs can provide. They also have little to no advantage over Intel's modern iGPUs.
Again, you have to look towards the future here. It is most likely that any ARM-based laptop Apple introduces will have a new A-variant SoC, which would mean a new GPU. Such a chip, by definition, would be faster than previous designs.
Punting to some "super duper" GPU design, piled on top of the "competitive with Intel desktop-class CPU" design task, makes little sense when Intel/AMD/Nvidia continue to deliver in the current/upcoming Mac class of performance. OS X (Macs) continues to compete with Windows (and general PC hardware).
It makes sense because the SoC is the printed circuit board of the past; it is where Apple will have to work to innovate into the future. You can't innovate if the hardware is closed to you.

Frankly having access to that silicon is such a huge advantage for Apple that it doesn't even matter if the GPU is slightly slow.
Certainly, if future Intel (and AMD) offerings blow it, then an Apple ARM could be an option. But for what is on the likely roadmap for the next 2 years, Intel's offerings are in a different league, and Macs are not drastically retreating on price. Using ARM only gets Macs lower performance at the same or higher price than an Intel/AMD system. That is not particularly likely to help grow Mac market share at all.
Huh? How do you figure that ARM wouldn't allow Apple to lower prices significantly? In a nutshell, it is a big motivator here, as portable hardware becomes cheaper and cheaper. Apple can't compete if its prices are drastically out of line with the rest of the industry. Apple's goal with such ARM-based devices would be to maintain market share while retaining their margins. An ARM-based device could easily shave $300 off the hardware bill of materials.
The new Mac Pro is illustrative of where the bulk of the Mac lineup has to be in 4-6 years. ARM SoC solutions don't solve that problem now.
Explain! Because if you are talking cores, Apple can add as many as it likes.
They probably won't in 4-6 years either. No ARM SoC can support 6 TB ports, and very few can support even one (since there are few that have sufficient PCIe v2 lanes to keep both the GPUs and TB fed).
Come on guy, this is looking towards the past again! If Apple wants to support TB on the next A-series chip, they will design in the support; it is no big deal really. Given that this might be a new concept machine, it might not even have a TB port, USB port or anything else. Apple is completely free to do whatever it wants with this design.
Vendors (and users) who are entirely committed to just x86 cores complain.

So! It is never wise to be 100% committed to anything. That is what led the industry to the MS Windows mess.
 
With every new version of OS X, it looks more and more like iOS. I can see OS X for servers staying around, but looking at the apps for iOS devices from Apple and others, I think Apple will push iOS onto laptops and desktops in the near future (around 5 years). A lot depends on how far Apple pushes their processors. Or switches processors again.

Actually, OS X Server isn't hugely different from regular OS X from what I can tell. I've been using it fairly regularly, and I think they will morph its design along with any redesign they do to OS X. They might put off redesigning the Server app a little longer, just because reports indicate they needed extra help from iOS designers just to get OS X 10.10 ready in time. So it's not as pressing a priority.
 