
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,290
30,372



Patently Apple reports that Apple last week filed a curious new trademark application for the term "macroscalar". The company has typically registered trademarks quietly in countries such as Trinidad & Tobago, only to apply for the marks in the United States and other major markets later, once the new products and features have been announced. While an application for "macroscalar" was indeed filed in Trinidad & Tobago last August, the new U.S. application and a similar one in Hong Kong are sparking speculation that Apple may be preparing to announce some new processor technology.
Apple's "Macroscalar" isn't just a new marketing line; it's a processor architecture that's been in the works at Apple since 2004. In fact, Apple owns at least four granted patents on the technology that has yet to come to light. We first covered it in 2009 and briefly twice last year.
[Image: Apple's "macroscalar" trademark application]



ZDNet published more on Apple's macroscalar architecture last July following one of those patent disclosures, including an explanation of how the technique could be used to improve processor efficiencies by optimizing data-dependent loops.
The macroscalar processor addresses this problem in a new way: at compile-time it generates contingent secondary instructions so when a data-dependent loop completes the next set of instructions are ready to execute. In effect, it loads another pipeline for, say, completing a loop, so the pipeline remains full whether the loop continues or completes. It can also load a set of sequential instructions that run within or between loops, speeding execution as well.
From a user perspective, the technology could support faster performance and lower power consumption, something Apple would definitely be interested in pursuing for its mobile devices.
Since Apple provides its own compilers as well as designing CPUs, it is uniquely positioned to offer a complete macroscalar solution to its large band of iOS developers, further widening the price/performance gap between it and the iPad wannabes.

Is it a breakthrough? It could be if the efficiencies it promises can be realized in practice. We'll have to see just how good Apple's compiler engineers are.
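
To make the idea concrete, here is a rough sketch in plain C of the kind of "data-dependent loop" being described. It is purely illustrative: Apple has not published the macroscalar instruction set, so the hardware-level behavior in the comments is only what the patents hint at.

```c
/* Illustrative sketch only -- not Apple's macroscalar instruction set. */
#include <stdio.h>

/* The loop exits as soon as the running total crosses a threshold, so the
 * exit branch depends on the data itself and the trip count cannot be
 * known at compile time. A conventional pipeline stalls or speculates
 * (and may mispredict) at that branch; the patents describe the compiler
 * also emitting a contingent secondary instruction stream for the
 * post-loop code, so valid work is queued whichever way the branch
 * resolves. */
static int sum_until_threshold(const int *data, int count, int threshold)
{
    int sum = 0;
    for (int i = 0; i < count; i++) {
        sum += data[i];
        if (sum >= threshold)      /* data-dependent exit condition */
            break;
    }
    return sum;
}

int main(void)
{
    int samples[] = { 3, 9, 4, 12, 7, 1 };
    printf("%d\n", sum_until_threshold(samples, 6, 20));
    return 0;
}
```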
While no specifics on Apple's plans have been revealed, the public application for a trademark on the "macroscalar" term is a curious development for the company given that most of its trademarks relate to product and feature names and other promotional descriptions. As a result, speculation suggests that Apple could be preparing to make a significant announcement that will prominently feature the "Macroscalar" term in a similar way to how the company uses "Retina" to describe its high-resolution iPhone and iPod touch displays.

Article Link: Apple's 'Macroscalar' Trademark Application Sparks Speculation on Processor Architecture Advances
 

*LTD*

macrumors G4
Feb 5, 2009
10,703
1
Canada
Sounds like a prelude to some very, very big things - another shift. This is pretty exciting.
 

Dookieman

macrumors 6502
Oct 12, 2009
390
67
Can someone with a better understanding of the topic break this down into layman's terms?
 

fxtech

macrumors 6502
Oct 13, 2008
417
0
Ugh!

ANOTHER processor change? If I have to go through this all over again a fourth time I will stop programming and become a barista! :confused:
 

dasmb

macrumors 6502
Jul 12, 2007
369
385
Is it a breakthrough? It could be if the efficiencies it promises can be realized in practice. We'll have to see just how good Apple's compiler engineers are.

Hey, let me ask my AltiVec unit ...
 

wizard

macrumors 68040
May 29, 2003
3,854
571
No, iOS users.

ooh a good bit of news... for mac pro users

Mac Pros will stay x86 for a very long time, and I really doubt that Intel would license the IP to Apple to build an enhanced x86 processor.

I'm strongly leaning towards this being either an ARM enhancement or something rolled into a custom GPU. Either approach could offer Apple a significant performance advantage.

It is actually very interesting to read some of Apple's CPU patents that have been filed over the years. In some cases it looks like they are trying to design or enhance a processor for the execution of Objective-C code. I'm just wondering if this tech will debut in the A6 processor.
 

firewood

macrumors G3
Jul 29, 2003
8,107
1,343
Silicon Valley
Apple doesn't design CPUs, they design SoCs

They are rumored to have paid a multi-million dollar(pound?) fee for an architecture license from ARM, allowing them to design their own ARM CPUs.

Why would they do that, and then just use someone else's CPU implementations in their SoCs?
 

dasmb

macrumors 6502
Jul 12, 2007
369
385
ANOTHER processor change? If I have to go through this all over again a fourth time I will stop programming and become a barista! :confused:

If you, as a programmer, still give two craps what chip your code is running on -- well, you suck. That would also explain why barista is an acceptable move, salary wise.

Abstract, brother, abstract. Use blocks, adapter objects, build lightweight APIs around your processor-intensive work and use the core APIs wherever possible -- and advances in chips become basically free. Stop ice skating uphill.
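
For example, a minimal Grand Central Dispatch sketch in plain C (the arrays, the per-element work, and the sizes are all made up for illustration) -- nothing in it names a particular chip or core count:

```c
#include <dispatch/dispatch.h>
#include <stdio.h>

#define COUNT 1000000

static float input[COUNT];
static float output[COUNT];

int main(void)
{
    for (size_t i = 0; i < COUNT; i++)
        input[i] = (float)i;

    /* Describe the work per element and let the runtime decide how to
     * spread the iterations across however many cores the device has;
     * the same source runs unchanged on one core or many. */
    dispatch_queue_t queue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply(COUNT, queue, ^(size_t i) {
        output[i] = input[i] * 2.0f + 1.0f;   /* stand-in for real work */
    });

    printf("output[42] = %f\n", output[42]);
    return 0;
}
```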
 

Apple Knowledge Navigator

macrumors 68040
Mar 28, 2010
3,525
11,675
It makes sense that Apple will eventually move away from Intel chipsets to their own. The benefits are abundant:

- Apple nets more profit with each sale, since they won't have to hand Intel various licensing fees. Everyone in the pro community knows just how much workstation-class CPUs have gone up in price, and that has only inflated the Mac Pro's now ridiculous price. It only seems like three years ago that the Mac Pro was around £500 cheaper.

- They can control the release schedules and progress of their own technologies, rather than waiting on Intel to release their products.

- Their chipsets will work independently, and they won't have to compromise on performance/features as has occurred between Intel and Nvidia in their own disputes.

- It gives Apple more control over its products, and they can market the chips to their heart's content with fancy names.

- Lastly, of course, it means they can introduce new technologies not found in Intel chips or those from other rivals.

It all sounds good, but I can't imagine it'll happen for a while.
 

cmChimera

macrumors 601
Feb 12, 2010
4,271
3,753
It makes sense that Apple will eventually move away from Intel chipsets to their own. The benefits are abundant:

- Apple nets more profit with each sale, since they won't have to hand Intel various licensing fees. Everyone in the pro community knows just how much workstation-class CPUs have gone up in price, and that has only inflated the Mac Pro's now ridiculous price. It only seems like three years ago that the Mac Pro was around £500 cheaper.

- They can control the release schedules and progress of their own technologies, rather than waiting on Intel to release their products.

- Their chipsets will work independently, and they won't have to compromise on performance/features as has occurred between Intel and Nvidia in their own disputes.

- It gives Apple more control over its products, and they can market the chips to their heart's content with fancy names.

- Lastly, of course, it means they can introduce new technologies not found in Intel chips or those from other rivals.

It all sounds good, but I can't imagine it'll happen for a while.

That doesn't sound good, nor is that likely to be what this is. The biggest problem with switching away from Intel would be the MASSIVE incompatibility it would create.
 

wizard

macrumors 68040
May 29, 2003
3,854
571
Nope; if you follow Apple's patent filings over the years, it is obvious they do.

Apple doesn't design CPUs, they design SoCs

Look through Apple's filings over the last few years; it is pretty obvious that they have engineers working on hard-core logic for CPUs and other devices. At this point the most likely platform to integrate these technologies into would be ARM, and even more so a 64-bit ARM chip.

That said, I wonder if these technologies will even make it into the A6. The term "macroscalar" can also be taken in different ways, so they might be focused on how the SoC is put together and, like AMD, might be trying to come up with a heterogeneous processor. That in and of itself would be a major advancement. Even then, though, such a design depends upon low-level logic and certain freedoms with the ISA of the processors involved.
 

dasmb

macrumors 6502
Jul 12, 2007
369
385
It makes sense that Apple will eventually move away from Intel chipsets to their own.

Actually, it only makes sense in embedded systems, where interaction with third party hardware is at a minimum.

In desktop/laptop machines, part of the upside of being able to integrate with so many component manufacturers comes from a "bleed up" of cost savings, thanks to the massive number of general components made for Intel chipsets. On fully custom hardware, prices on chips for those Mac Pros might go down, but all the other prices -- graphics, memory, hard drive -- suddenly go up. This is okay for the iPad, whose product position allows for slightly reduced hardware performance requirements. On a performance machine, it would likely mean even higher consumer prices, on par with SGI, Sun, or NeXT hardware in the mid '90s (i.e. 3-5 times that of consumer hardware, as opposed to today's 1.5-2x).
 

dasmb

macrumors 6502
Jul 12, 2007
369
385
That doesn't sound good, nor is that likely to be what this is. The biggest problem with switching away from Intel would be the MASSIVE incompatibility it would create.

AMD does okay.

Not great, maybe, but okay. Shoot, they won the 64-bit architecture war.
 

wizard

macrumors 68040
May 29, 2003
3,854
571
Exactly! Very well said, and one of the reasons Apple suggests using its tools.

Just as importantly, as a developer you need to learn to read between the lines when it comes to Apple's developer tools documentation. It has been pretty obvious over the last few years that Apple has long-term goals, and that taking their advice ahead of a technology's debut is in your best interest as a developer. One example is Grand Central Dispatch, where Apple strongly suggested to developers, prior to its announcement, that they use certain libraries to leverage future threading advancements. The same goes for resolution independence.

Now, all of this doesn't imply a new CPU architecture anyway. Rather, it implies improvements to an existing architecture. My suspicion is ARM with Apple-specific improvements. Moreover, some of the patents leave the impression in my mind that Apple's goal is to improve the execution of Objective-C code.

If you, as a programmer, still give two craps what chip your code is running on -- well, you suck. That would also explain why barista is an acceptable move, salary wise.

Abstract, brother, abstract. Use blocks, adapter objects, build lightweight APIs around your processor-intensive work and use the core APIs wherever possible -- and advances in chips become basically free. Stop ice skating uphill.

Yep! If you don't listen to Apple, you end up with sloppy code that doesn't perform, especially on iOS, where many APIs are tied tightly to the hardware. iOS leverages hardware in ways that few other OSes do.

For the developers that did listen, their apps really benefited from Apple's move to dual core on iOS, and without a lot of refactoring when the hardware came out. These hardware advances will happen no matter what; quad core and heterogeneous computing are coming, as are other advances. It makes no sense for a programmer to thumb their nose at Apple.
 

milbournosphere

macrumors 6502a
Mar 3, 2009
857
1
San Diego, CA
ANOTHER processor change? If I have to go through this all over again a fourth time I will stop programming and become a barista! :confused:
This sounds more like compile-time optimization. Unless they introduce some new APIs to help with loop optimization, odds are you won't even notice: the compiler will translate your code into some shiny new instructions in the object code, not change the Objective-C source itself.

I studied microarchitecture in college and did some work with instruction-level parallelism; I find this very interesting indeed.
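
To picture what "the compiler handles it" means, here's a hand-rolled sketch in plain C -- purely illustrative, nothing from Apple's toolchain. Compilers like Clang and GCC already unroll and vectorize a countable loop like this at -O2/-O3 without any source changes; the interesting part of macroscalar would be extending that kind of compile-time scheduling to loops whose trip count depends on the data.

```c
/* Illustrative only: a countable loop (trip count known before the loop
 * starts) that an optimizing compiler can already unroll/vectorize in the
 * object code, with no change to what the programmer writes. */
#include <stdio.h>

#define N 1024

static void scale_add(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = 2.0f * a[i] + b[i];
}

int main(void)
{
    static float a[N], b[N], dst[N];
    for (int i = 0; i < N; i++) {
        a[i] = (float)i;
        b[i] = 1.0f;
    }
    scale_add(dst, a, b, N);
    printf("%f\n", dst[10]);   /* 2*10 + 1 = 21.0 */
    return 0;
}
```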
 