The Europeans are to blame here for the excessive use of vowels in their languages. Sensible Americans have deleted all these extra vowels to save wear and tear on the quills used for writing. The more important modern side effect is that our forefathers reduced our bandwidth needs relative to the rest of the world. We should be thankful that we had people with such foresight.

I'm saying this as an American!

There are actually many words where we as a nation have gotten rid of excess vowels to save precious bandwidth. Color is another word that fits the mold.

Any bandwidth savings are outweighed by Americans' penchant for injecting superfluous particles and prepositions where they do not belong.

E.g.

British: Regardless, irrespective
American: Irregardless

British: Not that big a deal, not that much of a deal
American: Not that big of a deal

British: He had got it
American: He had gotten it.

Look at the wasted bandwidth. No wonder we have floods and earthquakes due to global warming.
 
An ARM A8X doesn't come close to the HD 5300.

You're right; the A8X has about half the GPU performance of the HD 5300.

Ice Storm Unlimited (graphics score):
A8X = 21776
HD 5300 = 45997
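A quick sanity check on the quoted scores (Python used just for the arithmetic; the scores are the ones posted above):

```python
# Ice Storm Unlimited graphics scores quoted above.
a8x_score = 21776
hd5300_score = 45997

ratio = a8x_score / hd5300_score
print(f"A8X reaches {ratio:.0%} of the HD 5300's score")  # roughly half
```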

Even though Skylake (a tock) will also probably launch in 2015?

Wrong; Skylake is not coming out until 2016-2017.

HD 5300 will be better than Iris Pro, I think.. so much better than the HD 5000.

Wrong.

HD 5300 is weaker than Iris Pro, and it is also weaker than the HD 5000 in the current MacBook Air.

HD 5300 < HD 5000 < Iris 5100 < Iris Pro 5200 < Iris Pro 6xxx (whatever it gets labeled)

Don't be fooled by the naming scheme.

HD 5300 will be around the same performance as the HD 4400, hopefully a little better.


I don't see these chips running OS X; more likely iOS.

I think they would have to spend too long in turbo mode to deliver a good experience with Yosemite.

No, these chips are x86; they will run OS X.

I'd have thought the next-gen MacBook Airs would use the 'Broadwell-U' processors rather than these?

I think there will be two new MacBook Air lines: a new fanless one using these Core M chips, and a continuation of the current line with Broadwell-U.

Note: Core M is just a renaming of Broadwell-Y, which is a renaming of Broadwell-ULX.

Could be due to the slim design, the high 3200x1800 resolution, and the reduction to a 3.5 W TDP. Older reviews at a 4.5 W TDP reported 2.67 points in Cinebench, and that would be good enough.

I hope these new MacBook Airs don't have 3200x1800 displays; way too much IMO.

Why do so few people understand clock speeds?
When comparing across different processor models, they mean absolutely nothing. Do you really think a first-gen 1.3 GHz Intel Atom would be more powerful than this chip? No, of course not.
This isn't necessarily a downgrade - especially given the GPU boost. We'll have to wait and see what the benchmarks are like before declaring it a downgrade, sidegrade or upgrade.
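As a toy illustration of why clock speed alone says nothing across architectures, consider a crude throughput model. The IPC figures below are invented for the sake of the example, not measured values for any real chip:

```python
def effective_perf(ipc, clock_ghz):
    """Crude relative performance: instructions per cycle times clock rate."""
    return ipc * clock_ghz

# Hypothetical IPC values: a narrow in-order core vs. a wide out-of-order core.
old_atom = effective_perf(ipc=0.5, clock_ghz=1.3)   # higher clock, low IPC
core_m   = effective_perf(ipc=2.0, clock_ghz=1.1)   # lower clock, high IPC

print(core_m > old_atom)  # True: the lower-clocked chip wins on throughput
```

The point is simply that the product of IPC and frequency, not frequency alone, decides real throughput.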

"There's a reason Intel CPUs still destroy ARM CPUs running at the same frequency. Architecture matters more than frequency does."

Do you realize how ironic that is? In the Pentium 4 era, Intel was the one bragging that nothing mattered more than clock frequency. Then the AMD Athlon slaughtered them in performance, and Intel ended up going to a small R&D shop in Israel to find an architecture that did not utterly suck.

The main question, though, is how much performance do you need? I had reason to fire up my 2002 dual-processor G4 recently, and it purrs right along. I replaced it with a 2009 mini due to the lack of software updates for PPC, not a lack of speed. Diminishing returns have set in. Even on the PC side, they are selling plenty of Celerons, plain Pentiums, and i3 processors.

By the way, the 2009 mini will be staying online for a bit longer since Apple nerfed the current mini. It's still adequate for the task today, but in another five years it will probably take 4 cores to boot the OS.

Thank you.
People, please read and understand the two quotes above.

----------

Yeah that's what I'm interested in.

CPUs can gain 10% every generation.

But will the integrated GPUs be powerful enough to drive DOUBLE the pixels in a Retina Macbook Air?

The current MacBook Airs have enough power to drive double the pixels.
 
The real problem with people talking about ARM vs x86 is that OS X runs on x86 based architecture, not ARM architecture.
The point is, there is nothing about ARM that keeps it from running Mac OS. It is just a matter of recompiling everything for ARM and patching up a few things. That is a bit of an oversimplification, but the point is that Mac OS is, and always has been, a portable OS that can be moved to different architectures.
They would have to re-engineer the entire OS to run on ARM,
Nope! I'm not even sure why you believe that. There would be some patching up required, but the basic operating system should already be cross-platform. Much of the software certainly is. At worst they'd have to redo some drivers and patch up some loose ends.
and isn't that just iOS?
No, not exactly! Not to confuse matters, but iOS is Mac OS with a different UI and scheduler. As such, Mac OS on ARM isn't far away at all. They would need to rebuild the current Mac UI as an ARM executable. Since it is already widely rumored that they have Mac OS running on ARM, this is likely a done deal. Mac OS on ARM is more of a marketing struggle than a technical one.

----------

The GPU performance will be meh, especially at those low clock rates.

Yeah, that's what I was getting at when mentioning the 1.4 GHz Haswell machines Apple sells. I suspect performance will be similar or slightly better, which isn't saying much at all.

----------

Thinking back to the days when Apple was chained to the old boat anchor that was Motorola, I find Apple's ability to design CPUs that can challenge Intel absolutely amazing.
Apple's A-series chips are perhaps the most under-recognized part of their innovation profile. The A8X is a fantastic chip, and most impressive considering how recently Apple entered the chip design business.
And you just know there's a locked off area within Apple HQ with OS X running on the A8X. :cool:

That I'm certain of.
 
Even current CPUs often run at similar frequencies (Intel SpeedStep). I imagine it as making the CPU capable of running in a much wider frequency range, depending on how much power the system can sacrifice for the required task. Therefore, it can save more when possible, and be faster when needed.
 
Not really. It'll scale itself up to either 2 or 2.4GHz if it runs into a demanding application that needs more power.

Herein lies the problem: as apps become more demanding, thermal throttling comes into play, so you get the opposite, a rollback of clock rates, not an increase. You only get full clock rates in short bursts of performance, which does little for truly demanding applications. How aggressive this rollback is depends heavily on the thermal characteristics of the implementation.
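The boost-then-throttle pattern can be sketched with a toy model. Every constant below is invented for illustration; real boost firmware is far more sophisticated:

```python
# Toy thermal-throttling model: the core runs at turbo until accumulated
# "heat" crosses a limit, then drops to base clock to shed heat.
def clock_trace(base_ghz, turbo_ghz, heat_per_tick, cool_per_tick, limit, ticks):
    """Return the clock chosen at each tick of the simulation."""
    clocks, heat = [], 0.0
    for _ in range(ticks):
        if heat < limit:
            clocks.append(turbo_ghz)   # thermal headroom left: run at turbo
            heat += heat_per_tick
        else:
            clocks.append(base_ghz)    # throttle back and shed heat
            heat -= cool_per_tick
    return clocks

trace = clock_trace(base_ghz=1.1, turbo_ghz=2.6, heat_per_tick=3.0,
                    cool_per_tick=1.0, limit=10.0, ticks=12)
# Turbo holds only for the first few ticks; after that the core mostly
# sits at base clock, surfacing to turbo in short bursts.
```

With these made-up numbers, turbo is sustained at first and then drops to roughly a one-in-four duty cycle, which is the "short bursts" behavior described above.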


There is no free lunch here; clock rate demands power. So to leverage the chip you need to improve your ability to remove heat. This is where a passive design will be most interesting: how do they remove the heat without a fan? You can't heat-sink the chip to the bottom of the case, as that is often insulated and just as often sits on somebody's lap. So this means heat-sinking to the sides or the top of the case. The top probably gets ruled out for aesthetics. So maybe we will see heat sink fins along the back side of the machine.
I know people don't think that 4.5 watts, or really 12 watts, is much power-wise, but try this: get yourself a small ten-watt incandescent bulb and try to hold it in your hand for an extended period of time. And that's just the processor; the rest of the electronics still generate heat. So a wait-and-see is in order here. I really doubt this will be an acceptable machine for the majority of users out there. I could be wrong, but I just don't think we are there yet for a fanless laptop.
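To put the bulb comparison in numbers (using the wattages mentioned above; one watt is one joule of heat per second):

```python
def heat_joules_per_hour(watts):
    # Essentially all electrical power drawn ends up as heat: 1 W = 1 J/s.
    return watts * 3600

core_m_tdp = heat_joules_per_hour(4.5)   # 16,200 J every hour from the CPU alone
small_bulb = heat_joules_per_hour(10)    # 36,000 J/h from the 10 W bulb analogy
```

Less heat than the bulb, but in a fanless chassis all of it still has to leave through the case.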

There are interesting technologies out there though. Carbon fiber has been shown to make a nice heat sink.
 
There was no question that these fanless Broadwell systems were going to throttle heavily and fail to be competitive with current MBA/ultrabook performance expectations. This is essentially a first-gen product: the first "real" Intel CPU designed for fanless tablets. It'll be another year or even two before you can get sustained notebook performance from them, and of course the actively cooled systems of that generation will still be much faster.

Personally I don't get the appeal of a fanless system. I'd rather have some active cooling (which could turn off when the system is under minimal load) and have the option to hit higher sustained performance targets. I suppose there is a weight target (like <1lbs for the iPad Air) at which point any active cooling becomes unreasonable simply because of the extra mass, but beyond that form factor I'd prefer a fan.
 
To a point. You can still swap out any drive, but Apple hasn't made it easy for people to get to, and it voids the warranty.
Actually that is one reason why I've never liked the iMacs.
I'm not sure about the random failure rate for SSDs, but magnetic drives are still as flaky as they've always been. For the two machines Apple still puts them in, they should be easily accessible, at least for repair's sake.
I don't disagree with this, but accessibility isn't as bad as some consumer hardware. Maybe I'm an exception here, with several toolboxes full of tools, but I don't consider the Mini to be a problem. The iMac, on the other hand, is just stupid in my mind.

As for SSDs, Apple seems to have few problems, if any. That is, you don't see long threads posted in various forums about SSD problems.
Now this is true, and something I've considered myself. Thunderbolt SSDs should only be a hair slower than a drive hooked directly to the motherboard via PCI-E, and even USB 3.0 is more than fast enough for file storage.
As long as you don't have to move apps or the user directory off the internal drive, performance is acceptable for many use cases. That said, carrying around an external drive to supplement a laptop's drive really, really sucks. I'd love to see a 2 TB option for the internal SSD in Apple's laptops. At a reasonable price, of course.
Other than making for a slight bit more clutter on your desk, there aren't any real disadvantages to going with an external drive.

Well, reliability is an issue for many users, though Apple's desktops support external drives just fine.
 
You are right. The massive number of specialized parts in the A8X (many of which we don't even know the purpose of!) is what caught most people by surprise, and it differentiates Apple from anyone else. When they bought top-notch chip design expertise, it was a crucial move for them; more than we first imagined it would be.

They also have a specialized SoC in their watch; I wouldn't be surprised if the watch gets released off-cycle from the iPhone and iPad, so they could test new processes in those lower-volume SoCs.

With the on-SoC GPUs and DSPs being used more and more for tasks that can be done in parallel, we are at a point where the CPU is becoming less and less important in the scheme of things. In that regard, Intel is fighting a losing battle, since Apple is no longer just fighting to make the CPU faster, and the CPU matters less and less to them.

If Apple bought Imagination (and possibly AMD for the ATI/APU architecture, which is finally starting to look interesting) and started doing custom GPUs, I think you'd really see Intel take notice.

As for Windows applications going to ARM: if developers support standard libraries like OpenCL, they mostly don't rely on the CPU to do most of the job anymore. In that case, those companies can easily port their software to where most of the untapped market is right now: ARM.

Apple's ImgTec license already gives them a custom GPGPU.

http://www.imgtec.com/investors/detail.asp?ID=836

Imagination Technologies Group plc (LSE: IMG, "Imagination") announces that Apple has extended its multi-year, multi-use license agreement, which gives Apple access to Imagination's wide range of current and future PowerVR graphics and video IP cores.

Under the terms of the above licensing arrangement, Imagination will receive on-going license fees, and royalty revenues on shipment of SoCs (Systems on Chip) incorporating Imagination's IP.

Apple modifies the design for their own Apple A Series SoC.

----------

You are right. The massive number of specialized parts in the A8X (many of which we don't even know the purpose of!) is what caught most people by surprise, and it differentiates Apple from anyone else. When they bought top-notch chip design expertise, it was a crucial move for them; more than we first imagined it would be.

They also have a specialized SoC in their watch; I wouldn't be surprised if the watch gets released off-cycle from the iPhone and iPad, so they could test new processes in those lower-volume SoCs.

With the on-SoC GPUs and DSPs being used more and more for tasks that can be done in parallel, we are at a point where the CPU is becoming less and less important in the scheme of things. In that regard, Intel is fighting a losing battle, since Apple is no longer just fighting to make the CPU faster, and the CPU matters less and less to them.

If Apple bought Imagination (and possibly AMD for the ATI/APU architecture, which is finally starting to look interesting) and started doing custom GPUs, I think you'd really see Intel take notice.

As for Windows applications going to ARM: if developers support standard libraries like OpenCL, they mostly don't rely on the CPU to do most of the job anymore. In that case, those companies can easily port their software to where most of the untapped market is right now: ARM.

Apple will never buy AMD because of antitrust requirements; they'd have to license AMD's ASICs and supply AMD CPU/APU/Radeon GPGPUs to third-party PC vendors.

What Apple should do is extend their agreements with AMD for the Radeon series across all systems as BTO options, and fully implement OpenCL 2.x in OS X and iOS.

Then, when the post-Carrizo FX/APU design [Excavator cores] arrives, offer APU/FX options for their systems.

Apple will then force Intel's hand to open up licensing of Thunderbolt IP to AMD.
 
The point is, there is nothing about ARM that keeps it from running Mac OS. It is just a matter of recompiling everything for ARM and patching up a few things. That is a bit of an oversimplification, but the point is that Mac OS is, and always has been, a portable OS that can be moved to different architectures.

This is not a "bit" of an oversimplification. I can't believe this talk comes up every time an ARM processor has good specs, with everyone acting like next year everything will be ARM.

Let's get this straight: Apple surely has an OS X version compiled for ARM, but it makes no sense to use it in one product line, and certainly not right now. Would you totally fragment OS X for a fanless Air? No. ARM could replace some processors here and there, but not in the high-end market.

You do realize we currently have x86-compiled binaries? These just don't run on an ARM-compiled OS X. So we'd have an OS X that runs on ARM with no third-party applications.

Has Microsoft been lucky that nobody on Apple boards remembers, or what? What you are asking for is what Microsoft did with the Surface. They released the Surface RT and the Surface Pro, which cost like $1000 more but ran on x86 processors.

So people bought the RT, and there was no software that ran on it except Office, because no one bothered to recompile their applications for the RT's ARM processor.

If that happened, this forum would be full of questions about why my 2015 fanless ARM Air doesn't run <some application>.
 
Hope it comes soon, and in gold. I want a new laptop with a new design. My MacBook Air is getting old.

But still the most beautiful laptop that exists... :)
 
I have no problem with Apple creating slower, thinner, cooler laptops. Maybe the market exists. Many people use their laptops for simple word processing, Facebook, and YouTube, which you can do on an iPad.


What bothers me is that they apply that mentality to the "Pro" laptops too. There are people out there who would not mind if their laptop were 0.3 mm thicker, 0.2 pounds heavier, and had 6 hours of battery life instead of 10, given that it would be a top performer with a current GPU.

Agree with you. I for one am in the market for a 12-inch fanless notebook with a 12-hour battery that is very light. It should have a 750 GB SSD and cost less than $2000, though. It should run Netflix, YouTube, Safari, Microsoft Office, Aperture, Pixelmator, light consumer-grade video editing, and such. It would be a secondary notebook for traveling, next to my work-mandated Dell brick. Essentially, the iPad is too limited for my uses, and this would be an iPad replacement with an open file system and a USB port. Maybe an iPad Pro would fit the bill as well. The Surface Pro would.

In addition, I would always want a 15" MBP that is as fast as it gets. The MBP doesn't need to be super slim or light.
 
An iPad Air 2 with 128 GB (or something similar) running OS X sounds good to me. I don't care if it's a little slower.
 
Err, no! The clock rate is important because they obviously lowered it to hit a specific TDP figure. Combine that with the lackluster architectural gains over the last couple of years, and this becomes a huge concern. The question then becomes just how bad a performer this chip is when limited to 4.5 watts. We really don't know, but there is enough evidence to indicate that it won't be pretty.

You're right (and know more about this than I do), but I don't mean that this is a fast processor at all. I just wanted to point out that it's not useful to judge a CPU purely based on its clock speed like many people seem to be doing, comparing this to a 90s processor because of the "MHz".
 
I must react to this:
Ad 1) Yes, that's true; however, in Apple's case, the MINIMAL configuration available should have been EIGHT GB since at least 2013, if not 2012, not 4 GB.
Well, Apple has always had an issue with this. However, there are many, many applications where 4 GB of RAM in a Mini would be more than enough. Not for desktop workstation usage, but for dedicated uses like a media center PC. Many uses in the corporate world will never need more than 4 GB of RAM either.
Ad 2) Is it worth saving something in the range of 0.1-0.5 watts in a system like the Mac mini? For Apple, of course, as they will make more money from RAM upgrades and more frequent hardware updates, but I'm not so certain about the customers' opinion of this :) .
That is actually an interesting question. In my case, I believe the reason we didn't get a quad core is that Apple ran out of room in their power budget. People forget that each TB port needs 10 watts allocated out of the power budget. Combine that with USB demands and phony Intel power ratings, and I believe Apple simply didn't have a viable quad-core alternative. So a few watts saved here and there might make a difference.
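The power-budget argument can be sanity-checked with back-of-the-envelope numbers. Every figure below is an assumption for illustration, not Apple's actual budget:

```python
# Hypothetical power budget for a Mac mini-class machine (all numbers assumed).
total_budget_w = 85.0                 # external supply rating (assumption)

reservations = {
    "thunderbolt": 2 * 10.0,          # ~10 W reserved per Thunderbolt port
    "usb":         4 * 4.5,           # USB 3.0 ports at 900 mA x 5 V
    "ram_ssd_io":  10.0,              # everything else on the board (guess)
}

cpu_headroom = total_budget_w - sum(reservations.values())
print(cpu_headroom)  # 37.0 -- not enough headroom for a 45 W quad-core part
```

Under these assumed reservations, only about 37 W is left for the CPU package, which is the kind of squeeze the post describes.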

The other thing here is that Apple could have made a massive volume deal for LPDDR3 RAM which made putting this more expensive solution into a Mini feasible.

As for power saved I have no idea what that figure might be, however if you are one to leave a machine powered up all the time every little bit helps.
Ad 3) That's true, mainly for IGPs, which use RAM as VRAM. But soldering RAM won't provide a significant speed bump, especially if you use low-voltage instead of normal-voltage modules... Energy had to be saved *SOMEWHERE*.
In Haswell, the in-module RAM can be used that way. In Knights Landing, the next Xeon Phi, they are integrating very fast Memory Cube technology right into the processor module. This produces an extremely fast memory subsystem, which in the initial hardware will be 16 GB of RAM. It will probably take a while, but this tech will work its way into workstations and possibly laptops.

The RAM subsystem is actually the next area of focus for getting better performance out of computer systems, as RAM has become far too slow to keep modern processors fed. The first step here is the move to DDR4, which is just starting to happen.
Ad 4) It's not comfortable to always carry some external drive with you... The cloud isn't a solution, IMHO.
Believe me, this I know. I carry around a big external drive with my laptop, and it is a pain. I also agree that the cloud is a joke, at least for bulk storage.
Hardware may become cheaper, more energy efficient and slower. But we'll pay more for more optimized software in exchange. We're terribly wasting so many CPU cycles...

Well, some people waste a lot of CPU cycles. This rumored laptop might not be a bad machine for them. For those of us who want to move forward, it is a big question mark right now. The initial performance figures, power, and clock values do not inspire.

As a side note, Broadwell does enhance and add a few instructions, which ought to lead to more efficiency and performance once software is updated to leverage them. I don't want my posts to sound totally negative about Broadwell. It's rather that this chip might disappoint in a fanless design. Put a 17-watt Broadwell in the current MBA and we might have one impressive upgrade.
 
... And designing their own chips allows Apple to transition to 64-bit-only CPUs in a few years (A7 and higher are in fact nearly two CPUs in one package; the difference between the 32-bit ARM instruction set and the 64-bit one is enormous).
Apple is pushing that with App Store requirements starting next year.
I think that around 2018, Apple will release 64-bit-only ARM iDevices. It will save some die space and decrease energy consumption.
It could happen sooner than that. It is really a question of how much advantage they can get from deleting legacy support. I do suspect that their goal is as you describe though, to support 64 bit ARM only.
Intel has tried to make a transition to a 64-bit architecture (if I'm not mistaken, it was called Intel Itanium) ... But it didn't succeed. AMD's x86_64 instruction set was better for the transition, as personal computers rely on backward compatibility much more than smartphones and tablets.

Actually, what I was implying there was to stay with x86_64 but remove many of the unused capabilities for modes no one uses anymore. It probably should have been the route Atom took, as that would have made for a very low-transistor-count 64-bit chip.

In other words, there is a lot of stuff that Intel has to support in hardware and microcode that simply isn't needed anymore. A break from that is long overdue.
 
I have an '11 MBA that does nearly everything I need: It's easy to carry and runs MS Office, Keynote, Mail, Safari, and a few other apps. Its only drawback is power consumption - I have to turn the screen brightness way down to get about 3 1/2 hours of use when I'm flying or otherwise unable to recharge. If Apple releases a laptop that extends battery life even more than the current Haswell MBA and has a retina display, I'm in, even if performance is nearly the same as what I get now.

Agreed. I have the same model as you ('11 MBA). I primarily need only two things in an upgrade: more storage (128gb has been tight for me for quite a while), and longer battery life. Retina screen wouldn't hurt as well, but not a requirement. I have no performance/speed complaints with what I have now.
 
You guys just wait till Apple starts to use floppy disks again...

It's revolutionary. Apple will claim to have revolutionized the world by creating magnets capable of storing information on a thin layer of coated plastic.

Maybe in time, Apple will realize the value of its aluminum cases and use magnets to store data on the inside surface of their computers. This will lead to larger cases, as larger cases have more surface to hold information and are cheaper than hard drives. This trend will lead us back to more efficient cooling, fewer fans, and upgradable RAM.
 
Apple is pushing that with App Store requirements starting next year.

It could happen sooner than that. It is really a question of how much advantage they can get from deleting legacy support. I do suspect that their goal is as you describe though, to support 64 bit ARM only.


Actually, what I was implying there was to stay with x86_64 but remove many of the unused capabilities for modes no one uses anymore. It probably should have been the route Atom took, as that would have made for a very low-transistor-count 64-bit chip.

In other words, there is a lot of stuff that Intel has to support in hardware and microcode that simply isn't needed anymore. A break from that is long overdue.

Well, it could have been done earlier, you're right: the iPhone 5, 5c, and A5 devices are the last ones that don't support 64-bit. iOS 9 will support the iPhone 5 and higher (2015). If the current life cycle stays the same in the next few years, then iOS 10 can be 64-bit-only (killing the last 32-bit-only (A6) devices). That would be 2016 (2 years!)

Why doesn't Intel remove these? Is it not possible because of software... or do they just obey the rule "don't touch it if it's working"?
 
This is not a "bit" of an oversimplification. I can't believe this talk comes up every time an ARM processor has good specs, with everyone acting like next year everything will be ARM.
The technical end isn't the problem; as I mentioned in another post, it's more of a marketing problem.
Let's get this straight: Apple surely has an OS X version compiled for ARM, but it makes no sense to use it in one product line, and certainly not right now. Would you totally fragment OS X for a fanless Air? No. ARM could replace some processors here and there, but not in the high-end market.
That can be dealt with in various ways. The easiest would be to avoid marketing the device as a Mac. It could be a more powerful X device, where X is a name that isn't "Mac".

Beyond that, what is the obsession here with the high end? Nobody is under the illusion that ARM can beat the very best of Intel's hardware lineup. ARM can, however, compete very nicely against the Intel hardware that goes into lower-end laptops. This would allow Apple to market a decent laptop for hundreds less.
You do realize we currently have x86-compiled binaries? These just don't run on an ARM-compiled OS X. So we'd have an OS X that runs on ARM with no third-party applications.
You do realize that Apple has the App Store and can compel developers to deliver ARM binaries? Even if they don't do that, Apple has thousands of ARM apps on the iOS store right now. All they need is a version of Mac OS that can run those apps in a window, and the problem of software evaporates. They essentially have a large library of bridging apps already.
Has Microsoft been lucky that nobody on Apple boards remembers, or what? What you are asking for is what Microsoft did with the Surface. They released the Surface RT and the Surface Pro, which cost like $1000 more but ran on x86 processors.

So people bought the RT, and there was no software that ran on it except Office, because no one bothered to recompile their applications for the RT's ARM processor.
Surface failed because of a combination of clunky hardware and a terrible OS. To imply that Apple would release such a terrible combination is just foolish. Surface was dead before it was even released.
If that happened, this forum would be full of questions about why my 2015 fanless ARM Air doesn't run <some application>.
Funny, I've yet to hear anybody complaining about the iPad not running xxx piece of software. There is an obvious problem of avoiding user confusion, but Apple can address that by creating a new product category that is not a Mac. Think about it: when was the last time you heard an Android tablet user complain about not being able to run an MS Windows app? Or, for that matter, a Windows user complain that a Mac OS app doesn't run on his machine?

I simply don't see where all of these problems come from; maybe an overactive imagination or something. Apple has demonstrated that they can get developers on board for compelling new platforms. An ARM-based laptop could be such a product, especially if it comes with a compatibility mode for iOS apps.
 
Apple is pushing that with App Store requirements starting next year.

It could happen sooner than that. It is really a question of how much advantage they can get from deleting legacy support. I do suspect that their goal is as you describe though, to support 64 bit ARM only.


Actually, what I was implying there was to stay with x86_64 but remove many of the unused capabilities for modes no one uses anymore. It probably should have been the route Atom took, as that would have made for a very low-transistor-count 64-bit chip.

In other words, there is a lot of stuff that Intel has to support in hardware and microcode that simply isn't needed anymore. A break from that is long overdue.

I hope you're not referencing Intel's Itanium and its instruction set. If so, that's IA-64, and a complete nightmare.

Glad that never surfaced for consumers.
 
Apple is pushing that with App Store requirements starting next year.

It could happen sooner than that. It is really a question of how much advantage they can get from deleting legacy support. I do suspect that their goal is as you describe though, to support 64 bit ARM only.


Actually, what I was implying there was to stay with x86_64 but remove many of the unused capabilities for modes no one uses anymore. It probably should have been the route Atom took, as that would have made for a very low-transistor-count 64-bit chip.

In other words, there is a lot of stuff that Intel has to support in hardware and microcode that simply isn't needed anymore. A break from that is long overdue.

You want them to get rid of legacy cruft? Is that not what Itanium was meant to do?
 
Running OpenGL on the HD 5300 at 900p is around 57 fps.
OpenGL on the Iris Pro at 900p is around 52 fps.
 