For f***'s sake, x86 is a 40-year-old crap architecture that should have died decades ago.
ARM is only targeting a specific subset of servers, not servers in general. The targets they are aiming at are 'front end' cloud servers (e.g., generic web server front ends) and highly variable load servers. Think hosted virtual machines, where the host company makes money by how many limited-throughput 32-bit VMs they can pack onto some hardware. Or "peaker load" servers that can rapidly spin up to handle load bubbles and then spin back down to extremely low power usage when the load evaporates.
Intel is now targeting the same market with their Atom server offerings. All this has very little to do with workloads where you need full-time computational horsepower. In short, in parts of server centers power efficiency was becoming a bigger selling point, and ARM saw (and still sees) an opportunity.
True, but there are other ways to address the browser performance issues. Install a WebKit nightly on an old computer, for example. They now compile JavaScript with an LLVM backend, and they are also working on compiling CSS. On my old 2008 MBP the snappiness was very noticeable. Frankly, the A7 in its current form isn't far from the performance of this machine. By the time this hardware ships, I'm not sure many would notice a JavaScript performance issue when upgrading from older hardware.

When I wrote about potential issues with single-threaded performance, I was first and foremost thinking about web applications. JavaScript is essentially single-threaded. Yes, you can parallelise some stuff with WebWorkers, but most of the code, especially the parts which handle DOM changes, event responses, animation etc., is still going to be single-threaded.
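A minimal sketch of that split, assuming a hypothetical worker.js doing the heavy lifting: computation can be offloaded to a WebWorker, but the DOM update that consumes the result still lands on the single main thread.

```ts
// main.ts -- illustrative sketch, not from any real app.
// Heavy number-crunching is shipped off to a worker; the DOM mutation
// that consumes the result still runs on the single main thread.
const worker = new Worker("worker.js");

worker.onmessage = (e: MessageEvent<number>) => {
  // Back on the main thread: DOM changes cannot be parallelised.
  document.querySelector("#result")!.textContent = String(e.data);
};

worker.postMessage(1_000_000); // hand the heavy input to the worker

// worker.js -- runs off the main thread:
//   self.onmessage = (e) => {
//     let sum = 0;
//     for (let i = 0; i < e.data; i++) sum += Math.sqrt(i);
//     self.postMessage(sum);
//   };
```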
We already know that the A7 is the market leader in JavaScript performance in its current market segment, so one can only expect that things will get better.

With modern web applications, which heavily rely on advanced DOM manipulation and model binding (such as things built with AngularJS or EmberJS), single-threaded performance is the deciding factor in how smooth the website is perceived to be. Of course, it might very well be that an A7/A8 is able to execute all the JavaScript fast enough that no latency is perceivable.
Until this is clear, the single-threaded performance remains a concern. As of now, the current MBA is more than twice as fast in browser benchmarks.
No, it wouldn't. The proportion of the total cost of an MBA which goes towards the CPU is small to the point of being irrelevant.
Your argument is flawed on many levels. First of all, the benchmark scores are wrong. Looking at the Geekbench 3 results, an iPad Air with an A7 reaches 1478 points while a 2635QM reaches 2597 in single-core performance, so the difference is 'just' 2x per core. And Apple has doubled the speed of their ARM CPU, at least going by Geekbench, within the last two years; Intel only managed a 10% increase. Continuing the trend, ARM will outperform Intel within a few years.

Still, another problem with your argument is that you can't extrapolate the current trends like this. These things are not linear.
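To make the extrapolation (and why it proves little) concrete, here is the naive arithmetic using the Geekbench numbers quoted above; the growth rates are simply the quoted ones assumed to continue, which is exactly the flawed step:

```ts
// Naive trend extrapolation: A7 ~1478 vs 2635QM ~2597 single-core.
// ARM doubling every 2 years (~sqrt(2) per year), Intel +10% over
// the same period (~sqrt(1.1) per year). Assumed constant -- flawed.
let arm = 1478;
let intel = 2597;
let years = 0;
while (arm < intel) {
  arm *= Math.SQRT2;
  intel *= Math.sqrt(1.1);
  years++;
}
console.log(`Crossover after ~${years} years -- if the trends held`);
// Prints ~2 years, which shows how sensitive the conclusion is to
// extrapolating a couple of data points that are not linear at all.
```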
It is a different tool with a different purpose in mind. Still, what relevance does the discussion of forking have to what I wrote about web application performance?
Which is baloney! There is nothing about the architecture that keeps it from competing with Intel. In honest comparisons, the A7 holds its own against far more robust systems. Given that it does well now, there is nothing to keep it from performing much better when designed into a similarly robust system.

Sure, but people on here are getting all heated up that they will REPLACE Intel with ARM. It's impossible. The chipset is incapable of the sort of horsepower that Intel can deliver.
BS. You can run any app on ARM that you can run on i86. High-end apps just require a similar implementation, in other words more RAM and faster ports. There is nothing about ARM that keeps these ports from happening.

ARM is a reduced instruction set architecture, which is partly why it's more efficient and doesn't have many of the more advanced features of x64 processors. There is almost no way to get current high-end apps running on an ARM as-is.
Actually, there is considerable speculation in the industry that Apple had a lot to do with ARM's 64-bit architecture. They have more than a year's head start on working 64-bit hardware, for example. As for the GPU, a part whose importance is often underestimated, Apple has hired a very large number of GPU engineers lately. I wouldn't be surprised to find out that Apple has big plans here.

It's the same reason that you cannot just have the way more powerful GPUs from AMD and Nvidia run OS X or any other OS. They are not designed the same way. And don't forget, it's not actually Apple that defines the A7; it's the ARM company. Apple adapts the design.
This is just another balderdash rumour.
Nothing like ignorant hyperbole to make your point, eh?

That "stall" lasted for years. Maybe we should just stick a Power7 in a MBA; I'm sure that would work well.
iOS is the new OSX. Just a matter of time.
Derp.
This is like saying the only thing that keeps McDonalds afloat is their fast food.
x64 isn't 40 years old, and ARM isn't revolutionary. ...
While what you say is true, your basic assumption is that non-PC devices won't become more powerful and evolve, just as the PC has. There was a time when laptops were viewed as an adjunct to, not a replacement for, much more powerful desktops. Today, laptops are often the only PC a person uses.
As for tablets, some of the early ones were very limited. The Newton was neat, but even in its 2K incarnation it was still an adjunct to a PC, not a replacement. Tablets, while still an adjunct in many cases, have evolved to the point that for many things they can replace a PC. I can watch Netflix on one, stream it to my TV if I want, log on to mainframes or websites, and do everything I can on my PC.
PowerPoint on a tablet, OTOH, isn't yet powerful enough to replace a PC for running a presentation. Word, as you point out, is best suited for simple tasks or text entry for later use in the PC version.
As tablets evolve you'll see more and more tasks performed primarily on them, just as laptops started replacing desktops.
You make a valid pitch for your use case. But the retort would be that not everyone has, or wants, a PhD. The vast majority of ma-and-pa users who buy computers wouldn't even have heard of a citation (other than the kind that makes them go to court).
In fact, even in 2013 only about a third of the US population has even a bachelor's degree (see the Wikipedia article Educational_attainment_in_the_United_States).
The current A7 is designed for extremely low power; it is literally going into cell phones. Apple could easily take the same cores and double or more their numbers for use in a desktop machine.

It's not flawed. You have to include all the cores; they make up the whole processor. If Apple can't design a multi-core processor then it'll never catch up to the performance of Intel.
Not exactly a trustworthy site when the page has real technical errors. That said, it tends to support most people's opinions here. Run the A7 at the same clock rates as the current Intel systems with a corresponding RAM array and you have a very competitive platform, especially considering that more cores can be easily had in the ARM implementations.

Since you want to compare cores, here's the Core 2 Duo against the A7. Still lags behind.
http://cpuboss.com/cpus/Intel-Core2-Duo-T7400-vs-Apple-A7
There is no reason for them to go into MBPs. People are missing the whole point here, which is to offer up a new platform which delivers good performance at a dramatically reduced cost.

Apple's A7 and ARM processors in general are only good for light tasks. They are good processors for sure, but they should never be in anything that requires performance. I can see them maybe putting these in the MacBook Airs, but not the MacBook Pros, where people require performance.
Which is a terrible benchmark for comparing across platforms.

We'll go by your Geekbench website. Single-core:
Apple A7: 1418
Core 2 Duo: 1341
Core i7: 3106
Actually, they tell you nothing, because you are working with an ARM variant engineered into an entirely different platform. You have no idea what an A7-like processor can do with lots of faster RAM and an improved secondary storage system.

Unfortunately, that isn't taking the entire processor into account, which you have to do to be fair.
Multi-core:
Core i7: 12949
Apple A7: 2548
Even if Apple had a quad-core, these benchmarks tell me that Apple's quad-core would still only be at about 5000, still lagging behind Intel.
So true, so why do you compare an A7 designed for a cell phone against a top-end Intel chip? At the very least Apple would need four cores in the chip, something easy to obtain. Even more important to your benchmarking are RAM and I/O.

Multi-core is where it counts, because you typically multitask and you do use all the cores. Not so much on the iPhone or tablet, because you can only do one thing at a time, unlike a computer where you have browsers open, iPhoto open, backups running, music playing, video editing, writing a Word doc, running VMware Fusion, etc.
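For what it's worth, a quick Amdahl's-law sketch shows why "add two more cores, double the score" only holds for the parallel fraction of a workload; the 80% figure below is purely an illustrative assumption:

```ts
// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
// where p is the fraction of the workload that parallelises.
function speedup(p: number, cores: number): number {
  return 1 / ((1 - p) + p / cores);
}

const p = 0.8; // assumed parallel fraction, illustrative only
for (const n of [2, 4, 8]) {
  console.log(`${n} cores -> ${speedup(p, n).toFixed(2)}x`);
}
// 2 cores -> 1.67x, 4 cores -> 2.50x, 8 cores -> 3.33x
// So a hypothetical quad-core A7 only lands near 2x its dual-core
// score if the benchmark is close to perfectly parallel.
```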
The A7 will never replace Intel in Apple's lineup. It might be a new product line, but it'll never replace it.
"Post-PC" is not so much a gimmick, but far more so appealing to notion that most folks don't separate form factors from actual function very well. "Post PC" is far more the notion are going past the notion that a "PC" looks like the classic IBM "box with slots" from the 80's form factor or even the clamshell laptop of the 90's. That it runs DOS/Windows like apps. The personal computer market is going past being stuck with just those two primary form factors and being benchmarked solely on the OS side against Windows. The notion of a personal computer never was suppose to be stuck to a small subset of form factors.
What the term had grown to mean was an x86 + Windows + classic-form-factor machine.
That isn't what the notion is about. The origin of the Macintosh was as an appliance, more so than a malleable box with slots. This has little to nothing to do with anachronism and much more to do with calcification of terminology.
Those length limitations are far more artifacts of those specific programs than of the form factor and/or iOS.
Again, not particularly an issue of the form factors and/or platforms that are new.
Not necessarily. Mac OS made two previous transitions and apps worked.
That is more a question of whether Apple builds a scaffolding emulator or not. Personally, I don't think they want to put effort into that (they didn't on the last transition; they bought access to a solution).
Is it impossible to write a 250-page document on an iPad plus keyboard?
Footnotes and references are just a matter of software and ease of entry.
This is kind of a chuckle, since when Word first came out folks would say similar things about using Word versus the far more capable (especially for dissertations) TeX system, and other word processing programs they might be familiar with (e.g., Wang's text editor, etc.). [Even in the early 90's, Framemaker versus the then-current versions of Word would raise very similar issues of cross-linking, indexing, citations, etc., where Word "couldn't cut the mustard anywhere near as well".]
Word has, what, around 30 years of development time behind it on the Mac, and Word for iPad came out less than one year ago. Yeah, sure, they are going to be feature-equal, because the man-years allocated to both are about the same... Errrrr, not.
One of the primary reasons iOS apps with the same name have fewer features is that they are younger than their siblings. It isn't an issue of cannot/impossible and far more one of not-yet-implemented. "I am not able to do this with this specific version" is dramatically different from "cannot be done".
If you were actually trying to doom or implode Word for iPad, one highly successful tactic would be to declare that Word won't ship until it is 100% feature-equal to Word for Windows. Saddle the project with unnecessary complexity and watch it implode under the issues that arise with the finite resources available.
"But Word for iPad can't do it right now"..... Guess what? These ARM based laptops and desktops aren't shipping right now either. Software that hasn't shipped yet really isn't a huge issue for hardware that hasn't shipped yet either. When the first iPad shipped some folks commented about how couldn't do Photoshop and Lightroom on them. Several years later that is not entirely true. It isn't about "cannot" (not capable).
"Two programs can't possibly work together on iOS"? Not really.
The computational requirements of generating a 250-page document with references are relatively modest for modern CPUs/GPUs, which is one of the primary reasons the classic PC market is stagnating. An A7 (or future A8) SoC has all the horsepower needed to accomplish the task. You can trot out some exhaustive software feature-set laundry list and start quibbling over some relatively small subset that is missing, but "not capable of" is laughably unmotivated.
The problem the industry is trying to get to grips with is that the hardware is far more capable than these limited mainstream workloads require.
It can't do localized real-time text-to-speech or some other high-computational-horsepower app, perhaps. However, do what software was doing in the early 90's? That is just an implementation priority issue, not a capability issue.
It's close to 40 years... the 8086 came out in 1978... and yes the ARM processor isn't exactly young in terms of ancestry either.
Like Windows, I see x86 as the best and worst that could have happened to computing: it brought the masses but stifled competition. At one point you had either Intel or an Intel clone, simply because it was the only thing that ran Windows, and anything other than Windows was crap... and IMHO that was a pretty low benchmark.
For me, the success of running an ARM-based machine will depend on what current software I can run on it from day one... it's a pretty straightforward measurement... one which I'll use when my current family household of 4 machines is due a refresh.
Not likely. OS X is a more advanced OS; it would be the other way around. OS X integrates iOS, and you load OS X on your iPhone and switch between modes.
You and me both; the CPU switch was a HUGE headache for me since I had a not-insignificant amount invested in PPC workstations. Hell, the Quad with the Quadro is still in service.
In some instances that is true. However, Apple has an ace up its sleeve, and that is the existing iPad app collection. If they can run these apps in a window and maintain Mac OS's openness, they will be successful. In fact, I could see such a repository actually generating more Mac OS development.

The same API didn't particularly help Windows NT on Alpha and MIPS. The core impediment isn't just the API (or a simple recompile). The issue is that a large fraction of users aren't going to want to acquire new apps. Using the same stuff they already have is a bigger issue for most users than acquiring a new set of stuff in chicken-and-egg situations. The lack of an emulator to scaffold users is a goof that Microsoft seems not to grok at all (or doesn't particularly care about, as long as it can nudge Intel into following Microsoft's demands).
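For readers unfamiliar with what a "scaffolding emulator" does: at its simplest it is a loop that fetches, decodes, and dispatches guest instructions. The toy sketch below is purely illustrative (an imaginary three-instruction ISA); real solutions, like the one Apple bought for the last transition, do dynamic binary translation rather than interpretation, but the shape is the same:

```ts
// Toy interpreter for an imaginary 3-instruction "legacy" ISA.
// Real scaffolding emulators translate blocks of guest code into
// native code, but fetch/decode/dispatch is the conceptual core.
enum Op { LOAD, ADD, HALT }

function run(program: number[]): number {
  let acc = 0;
  let pc = 0;
  while (true) {
    const op = program[pc++];          // fetch
    switch (op) {                      // decode + dispatch
      case Op.LOAD: acc = program[pc++]; break;
      case Op.ADD:  acc += program[pc++]; break;
      case Op.HALT: return acc;
      default: throw new Error(`illegal opcode ${op}`);
    }
  }
}

// "Guest" program: LOAD 40; ADD 2; HALT  ->  42
console.log(run([Op.LOAD, 40, Op.ADD, 2, Op.HALT]));
```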
The problem was and is that the OS sucked. It doesn't matter which hardware it runs on; it sucked either way.
RT had multiple objectives (which didn't help), and 3rd-party Win32 apps may have helped a bit, but they were not "the problem".
Didn't hurt the iPad. In fact, I see significant hunger out there for alternatives, as the bulk of the i86 world has rubbed people the wrong way.

That is a huge one. When users already have/own a major OS on an architecture, starting another almost always runs into resistance from the established and existing user base.
I don't understand the merge mantra either. I also don't believe people understand just how much code is already shared between iOS and Mac OS. As far as sharing hardware, that is completely possible; at the driver level there are not that many differences to consider.

Not necessarily. Different tools for different uses. I don't think there is a deep need to completely merge iOS with OS X as a unified OS, any more than there is a deep need to merge the hardware. iOS and OS X share enough to get some cost efficiencies. iOS and OS X devices could over time share other components (storage flash, WiFi, etc.) without necessarily having to merge the CPU/GPU into the shared pool to get those cost efficiencies.
This argument is nonsense. Of course they aren't competitive with desktop GPUs; they don't burn 75 watts either. I'm not sure why you even brought this up in the context of low-end hardware.

Frankly, while the ARM v8 architecture (which Apple's A7 implements) is respectable in terms of the lower half of mainstream app CPU needs, the GPUs bundled with most ARM SoCs (for Apple and much of this ARM SoC discussion, these are not just ARM chips; they generally all have bundled GPUs) are not really competitive with desktop (and top-end discrete mobile) GPUs in terms of computational throughput.
Again, you have to look towards the future here. It is most likely that any ARM-based laptop Apple introduces will have a new A-variant SoC, which would mean a new GPU. Such a chip by definition would be faster than previous designs.

In terms of FLOPs performance, the ARM SoC GPUs are way off what the discrete GPUs can provide. They also have little to no advantage over Intel's modern iGPUs.
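To put the FLOPs gap in rough numbers: theoretical single-precision peak is about ALUs × clock × 2 (one fused multiply-add per cycle). The inputs below are ballpark illustrations, not vendor specs:

```ts
// Theoretical peak GFLOPS ~= ALUs * clock (GHz) * 2 (FMA = 2 FLOPs).
function peakGflops(alus: number, clockGHz: number): number {
  return alus * clockGHz * 2;
}

// Ballpark, illustrative inputs only:
console.log(peakGflops(128, 0.45)); // phone-class SoC GPU:   ~115 GFLOPS
console.log(peakGflops(2048, 0.9)); // desktop discrete GPU: ~3686 GFLOPS
// A gap of an order of magnitude or more is why "way off what the
// discrete GPUs can provide" holds, before memory bandwidth even enters.
```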
It makes sense because the SoC is the printed circuit board of the past; it is where Apple will have to work to innovate into the future. You can't innovate if the hardware is closed to you.

Punting to some "super duper" GPU design that Apple piles on top of the "competitive with Intel desktop-class CPU" design task makes little sense when Intel/AMD/Nvidia continue to deliver in the current/upcoming Mac class of performance. OS X (Macs) continues to compete with Windows (and general PC hardware).
Huh? How do you figure that ARM wouldn't allow Apple to lower prices significantly? In a nutshell, that is a big motivator here, as portable hardware becomes cheaper and cheaper. Apple can't compete if its prices are drastically out of line with the rest of the industry. Apple's goal with such ARM-based devices would be to maintain market share while retaining their margins. An ARM-based device could easily shave $300 off the hardware bill of materials.

Certainly, if the future Intel (and AMD) offerings blow it, then an Apple ARM could be an option. But for what is on the likely roadmap for the next 2 years, Intel's offering is in a different league, and Macs are not drastically retreating on price. Using ARM only gets Macs at lower performance capabilities at the same or higher price than an Intel/AMD system. That is not particularly likely to help grow Mac market share at all.
Explain! Because if you are talking cores, Apple can add as many as it likes.

The new Mac Pro is illustrative of where the bulk of the Mac lineup has to be in 4-6 years. ARM SoC solutions don't solve that problem now.
Come on, guy, this is looking towards the past again! If Apple wants to support TB on the next A-series chip, they will design in the support; it is no big deal really. Given that this might be a new concept machine, it might not even have a TB port, USB port or anything else. Apple is completely free to do whatever it wants with this design.

They probably won't in 4-6 years either. No ARM SoC can support 6 TB ports, and very few can support even one (since there are few that have sufficient PCIe v2 lane feeders to keep both the GPUs and TB fed).
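A rough bandwidth-budget sketch of that feeder problem, assuming (illustratively) a first-generation Thunderbolt controller that wants a PCIe 2.0 x4 uplink and drives two ports:

```ts
// Back-of-the-envelope PCIe 2.0 lane budget for Thunderbolt.
const LANES_PER_TB_CONTROLLER = 4; // x4 uplink per controller (typical)
const PORTS_PER_CONTROLLER = 2;    // assumption for illustration

function lanesNeeded(ports: number): number {
  const controllers = Math.ceil(ports / PORTS_PER_CONTROLLER);
  return controllers * LANES_PER_TB_CONTROLLER;
}

console.log(lanesNeeded(6)); // 12 lanes for TB alone, before the GPUs
// Typical ARM SoCs of this era expose only a handful of PCIe lanes,
// which is the "sufficient PCIe v2 lane feeders" problem above.
```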
Vendors (and users) who are entirely committed to just x86 cores complain.
With every new version of OS X, it looks more and more like iOS. I can see OS X for servers staying around, but looking at the apps for iOS devices from Apple and others, I think Apple will push it onto laptops and desktops in the near future (around 5 years). A lot depends on how far Apple pushes their processors. Or switches processors again.