If Apple wanted to create a notebook with an ARM processor, I don't think it would be too difficult to recompile the whole of Mac OS X and all of Apple's own applications for ARM. Third-party applications would take a while.
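Recompiling is usually the easy part; the work is in the handful of places with architecture-specific assumptions (byte order, inline assembly, JITs). A minimal, illustrative C sketch of how portable code isolates those corners - the preprocessor macros are the standard ones GCC/Clang define, nothing Apple-specific:

#include <stdio.h>

/* Portable code compiles unchanged for any target; only the
   architecture-specific corners need per-ISA variants. */
static const char *target_arch(void)
{
#if defined(__x86_64__)
    return "x86_64";
#elif defined(__i386__)
    return "32-bit Intel";
#elif defined(__arm__)
    return "ARM";
#elif defined(__ppc__) || defined(__ppc64__)
    return "PowerPC";
#else
    return "unknown";
#endif
}

int main(void)
{
    printf("Compiled for: %s\n", target_arch());
    return 0;
}

Apple's toolchain can also glue several such builds into one fat binary, which is how the PPC -> Intel transition shipped universal applications.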

I don't think the power savings from an ARM chip alone would be enough to justify this.

Yes, the third parties are still trying to recoup their investments in the PPC -> x86 transition, followed by the x86 -> x64 (without Carbon) transition.

Apple sometimes makes its ISVs run at full speed just to stay in the same place.


Wait! iPhone OS is OS X with Cocoa Touch. All they would need is x86 emulation/Rosetta, and that already exists.

Rosetta is Transitive's (now part of IBM - oops) QuickTransit. It does PPC->x86 translation. I can't find any evidence that IBM is enhancing it to do x64->ARM translation.
 
For heaven's sake, it has pthreads, it has NSThread, it has NSOperation, it has everything that Mac OS X 10.5 has. The iPhone OS does exactly as much multitasking as Apple wants. And it should be obvious that it is perfectly capable of multitasking as it is, or how else do you think the music player and the phone work at the same time as other applications?
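To make that concrete, here is a minimal pthreads example - plain POSIX threading code of the kind that compiles against both the Mac OS X and iPhone OS SDKs (the worker function is just a made-up placeholder):

#include <pthread.h>
#include <stdio.h>

/* Worker that runs concurrently with main(). */
static void *worker(void *arg)
{
    const char *name = arg;
    printf("%s: doing background work\n", name);
    return NULL;
}

int main(void)
{
    pthread_t thread;

    /* Spawn a background thread, exactly as on any other Unix. */
    if (pthread_create(&thread, NULL, worker, (void *)"worker-1") != 0) {
        perror("pthread_create");
        return 1;
    }

    printf("main: still responsive while the worker runs\n");

    /* Wait for the background work to finish. */
    pthread_join(thread, NULL);
    return 0;
}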

Apple has very valid technical reasons for not allowing applications other than iTunes and the phone to run at the same time. That has been discussed again and again and again. And the iPhone OS _is_ Mac OS X with all the unneeded bits removed.

With all the unneeded bits removed? Not all! When you get access to, and browse, the iPhone OS root file system, you will find many files that are in place simply because they are usually there, but are neither optimized nor removed.
 
Apple doesn't need to buy 150 CPU engineers just to make a franken-chip from existing parts.
Maybe you're unsure of what an SoC actually is, but usually it is exactly how you described it. Most people license an ARM design for the CPU, build their own logic, and then license the GPU design or build their own (e.g. NVIDIA). It isn't called a franken-chip but a system on a chip. If you think that Apple designed their own CPU, GPU and logic chips in under two years, I really have no idea what to say. By that logic NVIDIA (one of the best chip designers in the world) has a "franken-chip" in their leading mobile solution, the Tegra 2, because it uses an ARM CPU design with their own GPU/logic design.

I think you fail to understand SoC design and implementation versus a single-function chip like a CPU or GPU.
 
I'm not that sure how all this works, but I have some idea. To the pros: for all the people saying they would like to see this chip in a MacBook, would that not be possible, given that the full-blown Mac OS X runs on Intel chips?
 
Maybe you're unsure of what an SoC actually is, but usually it is exactly how you described it. Most people license an ARM design for the CPU, build their own logic, and then license the GPU design or build their own (e.g. NVIDIA). It isn't called a franken-chip but a system on a chip. If you think that Apple designed their own CPU, GPU and logic chips in under two years, I really have no idea what to say. By that logic NVIDIA (one of the best chip designers in the world) has a "franken-chip" in their leading mobile solution, the Tegra 2, because it uses an ARM CPU design with their own GPU/logic design.

If you have a system (like a Dell laptop) with a Broadcom GbE BCM57xx NIC chip, you have a system with a dual-core 64-bit MIPS R4000 CPU on the NIC chip.

Same as with ARM, MIPS licenses R4000 "building blocks" for other manufacturers.
 
I am just referring to the original source of this article, which claims that the A4 uses ARM's Mali GPU. So far there is no information contradicting that claim.

See:
http://uk.reuters.com/article/idUKLDE60R10320100128?type=companyNews&symbol=ARM.L

"The chip also includes an integrated graphics engine which it is extremely likely based on Imagination Technologies (IMG.L) IP." Imagination rises 3.6 percent.

Apple purchased a significant stake in Imagination Technologies during 2009 - it seems odd that they would do that and then not use the technology in their newest product. Therefore I don't think the claims of the A4 containing the Mali GPU are true.
 
I think you fail to understand SoC design and implementation versus a single-function chip like a CPU or GPU.

I think a lot of people don't know how chips are actually designed or implemented. Just about every component in a computer or device has chips made from all different sources... which is a lot more cost effective than doing every transistor yourself.

Yep - to keep new releases more secret and no longer tied to Intel's roadmap. However, I am not so keen to say that they are better than Intel yet; Apple will need time to get there.

Apple has been out of processor making for so long that at this point, I doubt they'll ever be able to catch up on their own. I mean, you ARE talking about the world's premier processor maker and fabricator... when you consider that their rival, AMD, is basically a half-to-full generation behind right now, and AMD has plenty of experience of their own, I don't see Apple leaving Intel with respect to Macs any time soon, if ever.

(Not to mention that leaving the x86/x64 world would be a huge issue, and Intel isn't going to license out x86 any more either.)
 
Fair enough. In what particular way were you underestimated?

Rocketman ;)

I just think it might be a bit early to imply that the A4 is comparable to the Snapdragon. It might be, it might not. Also, I would say that the number of people working on these products at Qualcomm might be underestimated. Finally, you didn't mention that the Snapdragon has been out for some time and a sequel is rumored to be on the way soon that might make the A4 look a little last-gen:

1) The Qualcomm Snapdragon 8X50A, a 45nm version of the current chip clocked at 1.3GHz. It will begin sampling to manufacturers later in January 2010, with the first products using it expected by the end of the year.

2) A dual-core Snapdragon, the 8X72, with twin 1.5GHz Scorpion cores, before 2010 is out. According to Qualcomm, the 8X72 will be suitable for both smartphones and smartbooks and capable of 1080p high definition.

http://www.1800pocketpc.com/2010/01/08/qualcomm-dual-core-1-5ghz-snapdragon-coming-in-2010.html

Just some thoughts...
 
Maybe you're unsure of what an SoC actually is, but usually it is exactly how you described it. Most people license an ARM design for the CPU, build their own logic, and then license the GPU design or build their own (e.g. NVIDIA). It isn't called a franken-chip but a system on a chip. If you think that Apple designed their own CPU, GPU and logic chips in under two years, I really have no idea what to say. By that logic NVIDIA (one of the best chip designers in the world) has a "franken-chip" in their leading mobile solution, the Tegra 2, because it uses an ARM CPU design with their own GPU/logic design.

I think you fail to understand SoC design and implementation versus a single-function chip like a CPU or GPU.

Most people don't understand that you can get an architecture license from ARM and create your own "clean sheet" implementation of a CPU that is compatible with ARM's chips. Only a very few companies in the whole world do an architecture license deal with ARM, because you need a large group of very experienced in-house CPU designers to pull that off.

I cited specific interviews with Qualcomm employees who stated that they are doing their own implementation of the ARMv7-A architecture in the Snapdragon. It's NOT a Cortex-A8 or A9 core. Did Qualcomm license the GPU from someone else? Yes, I am pretty sure they did.
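As an aside - and purely as an illustration, since this relies on ARM Linux's /proc/cpuinfo convention and is not something you can run on an iPhone - the "CPU implementer" field is one way to tell an ARM-designed core from a Qualcomm-designed one (implementer 0x41 is ARM Ltd., 0x51 is Qualcomm):

#include <stdio.h>

/* Illustrative only: ARM Linux exposes a "CPU implementer" code in
   /proc/cpuinfo that identifies who designed the core, independent of
   the fact that both implement the same ARM architecture. */
int main(void)
{
    FILE *f = fopen("/proc/cpuinfo", "r");
    char line[256];

    if (f == NULL) {
        perror("fopen");
        return 1;
    }

    while (fgets(line, sizeof line, f) != NULL) {
        unsigned int id;
        if (sscanf(line, "CPU implementer : %x", &id) == 1) {
            if (id == 0x41)
                printf("ARM-designed core (e.g. a Cortex)\n");
            else if (id == 0x51)
                printf("Qualcomm-designed core (e.g. Scorpion)\n");
            else
                printf("implementer 0x%02x\n", id);
            break;
        }
    }
    fclose(f);
    return 0;
}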

But I don't consider the Snapdragon a "franken-chip" just because they licensed the GPU from somebody else. The Snapdragon uses a Qualcomm-designed Scorpion core that implements the ARMv7-A architecture and combines it with a third-party GPU. That's more like a Qualcomm body with a prosthetic arm.

My definition of a "franken-chip" is one where all the important components, from the CPU to the GPU, are pre-existing. That's clearly not the case with the Snapdragon. And it's very likely not the case for the Apple A4 either.
 
Apple has been out of processor making for so long that at this point, I doubt they'll ever be able to catch up on their own. I mean, you ARE talking about the world's premier processor maker and fabricator... when you consider that their rival, AMD, is basically a half-to-full generation behind right now, and AMD has plenty of experience of their own, I don't see Apple leaving Intel with respect to Macs any time soon, if ever.

(Not to mention that leaving the x86/x64 world would be a huge issue, and Intel isn't going to license out x86 any more either.)

Why not have the best of both worlds then?

This chip would cover a lot of the functions needed inside a Mac just as much as it does in an iPad. Some of those would be double-ups of things already covered by the Intel chipset; some are things Apple would want custom implementations for anyway, like the trackpad driver and power management. I'm sure the list goes on.

Greater use means better supply cost or faster turnover of inventory. Plus, if they use a fuller range of clock speeds then they get to use more of each batch; I'd assume that with a custom job like this you still get different quality bins but still have to pay the fab for each full wafer. Add other parts that could be dropped out, adding to the cost saving.

On the feature side, Apple could have the base system boot and many programs run on just this processor, powering up the Intel CPU/IGP only to suit demand.
Not to mention adding a flash chip or two so the hard drive could be treated just as a Time Machine backup drive, with user profiles switched onto the flash drive as they log in.
 
I love that Apple is now using their "own" processor. Not only because it gives them that much more oversight of the "total experience" (which really is what Apple is all about), but because it will make it harder (read: more entertaining) for the haters to bash it (not that they won't try).

Also interesting to note how Apple openly advertises its CPU speeds on its laptop and desktop models (and surprisingly on the iPad), while the competition (which prides itself on hardware spec comparisons) simply tells you their laptop features the "Intel Core2Duo 5200." :confused:

Strange days indeed are upon us.

Damn right, LagunaSol. I agree 100%!!!
 
Sounds like a 1GHz iPhone could debut this summer with some 4.0 software action. Maybe even a storage bump/price reduction for both the new iPhone hardware and the iPad.

That'd make for a nice June.
 
Why not have the best of both worlds then?
If Intel chips are not being used because they suck down major power while giving higher performance (not required in this device)... how could having both possibly be 'best'?
If that power consumption and performance is necessary, you don't need the other chip. Having both just doubles costs. It isn't making it 'better'.


This chip would cover a lot of the functions needed inside a Mac just as much as it does in an iPad.

Huh???? You use an SoC (system on a chip) in order to reduce the number of chips on your circuit board, not to increase the chip count.

Perhaps Apple could use this as a "lights-out management" chip (like in Xserves). That's where you put a limited computer inside another computer that just runs on the trickle charge coming out of the power supply. You use it to control the computer (even if it is turned off). Historically these have been very low-power PowerPC chips running a stripped-to-the-minimum Linux or embedded OS, command line only.

I don't think any mainstream computers need lights-out management (other than what is already built into the Ethernet, wake-on-LAN, and power management). The A4 is complete overkill for that.


However, combined in the mainstream chipset with an x86 main CPU, it is just going to get in the way. It isn't going to 'help'. The major portions being consolidated here are the CPU/GPU/memory controller. Those are all things that Intel has clearly stated it intends to control. They aren't going to 'share' that with anyone. That's why they have cut NVIDIA off at the knees. Intel is moving the GPU and/or memory controller onto the CPU chip... just as this A4 chip does. You can't split responsibilities if both parties are doing the same consolidation.

Second, Macs are far too small in volume for Apple to be doing custom chip work of their own. The only reason this works for the Touch/iPhone/iPad is that Apple is going to get run rates in the tens of millions.
Macs come nowhere near that. Even if they did, they would still be an order of magnitude (or two) behind what Intel and AMD are doing.

It will not be cost effective. That's one of the problems Apple had back on PowerPC: they were doing their own custom system support chips, so the run rate on those chips was just what Apple sold, not what the overall personal computer market was doing. No frakking way are you going to get a similar cost structure when you are several orders of magnitude behind in scale.

The support chips in the PC market are consolidating all by themselves. Apple isn't going to get a big competitive advantage by trying to wrestle against the other PC support-chip vendors.

On the feature side, Apple could have the base system boot and many programs run on just this processor, powering up the Intel CPU/IGP only to suit demand.

That is really two computers inside one box. I don't see Apple doing that. It may get to a point where two independent computers are coupled together by Apple... but they will be separate.

Not to mention adding a flash chip or two so the hard drive could be treated just as a Time Machine backup drive, with user profiles switched onto the flash drive as they log in.

Eh? The Xserve can already, in part, use a flash 'stick' as the OS drive. The rest I don't see. A backup inside the same box as the primary drive is not really a good backup: if the box fries, the Time Machine image may fry too. It's safer as an external drive with an independent power supply.
 
I'm guessing the A4 is a quad-core Cortex-A9 (1GHz).
... should be comparable to the Tegra 2 dual-core Cortex-A9 (2GHz).

P.
 
I'm guessing the A4 is a quad-core Cortex-A9 (1GHz).
... should be comparable to the Tegra 2 dual-core Cortex-A9 (2GHz).
P.

Does anyone know how it compares in costs? When they produce the chips themselves, does it make a difference if they put 1, 2, 4, or 20 cores there? Or does it just get more expensive as the die gets bigger and fewer fit on each wafer?
 
Sounds like a 1GHz iPhone could debut this summer with some 4.0 software action. Maybe even a storage bump/price reduction for both the new iPhone hardware and the iPad.

That'd make for a nice June.

This is what I'm hoping for.
 
Apple using pre-designed solutions from ARM and manufacturing chips at the same foundries as Qualcomm and NVIDIA pretty much guarantees that Apple devices will have the same or inferior chips compared to other phones/tablets. For example, NVIDIA's Tegra 2 chip uses the same core (A9) as the A4 and runs at the same frequency. However, Tegra 2 uses NVIDIA's own GPU design whereas Apple uses a standard ARM GPU. One has to assume that Tegra will have the same CPU performance but better GPU performance than the A4.

Most likely the main reason Apple decided to design their own chips is that they wanted to make sure nobody can run their software on generic hardware.
Blah blah blah! The only thing you probably know about a chip is that it has a bunch of transistors and a piece of quartz crystal in it.
And so what if Apple wants to protect its OS? It is theirs. Dell doesn't have an OS, and neither do Sony, HP, Acer, etc. Apple now has chips, an OS and a deep talent pool. Not to mention billions and billions of dollars.
Viva Apple. They row their own boat like a big baller. Boooyaaaa!
 
Does anyone know how it compares in costs? When they produce the chips themselves, does it make a difference if they put 1, 2, 4, or 20 cores there? Or does it just get more expensive as the die gets bigger and fewer fit on each wafer?

It matters for licensing/royalty fees, for sure (ARM just increased some of their Cortex per-core fees).

Also, the Cortex-A9 MPCore design currently allows for a maximum of four cores, or at least it does according to their web page:
http://www.arm.com/products/CPUs/ARMCortex-A9_MPCore.html

Of course there's probably a bunch of other stuff on the chip as well that increases costs.
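Roughly, yes: a processed wafer costs about the same regardless of what is printed on it, so a bigger die means fewer (and, in reality, lower-yielding) chips per wafer. A back-of-the-envelope sketch in C - the wafer price, yield and die sizes below are invented numbers for illustration only:

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Classic dies-per-wafer estimate: usable wafer area divided by die
   area, minus a term for dies lost at the wafer's edge. */
static double dies_per_wafer(double wafer_diameter_mm, double die_area_mm2)
{
    double r = wafer_diameter_mm / 2.0;
    return (M_PI * r * r) / die_area_mm2
         - (M_PI * wafer_diameter_mm) / sqrt(2.0 * die_area_mm2);
}

int main(void)
{
    const double wafer_cost = 5000.0;  /* assumed $ per processed 300mm wafer */
    const double yield      = 0.85;    /* assumed constant yield; real yield
                                          drops as the die gets bigger */
    const double areas[]    = { 50.0, 100.0, 200.0 };  /* die sizes in mm^2 */

    for (int i = 0; i < 3; i++) {
        double dies = dies_per_wafer(300.0, areas[i]);
        double cost = wafer_cost / (dies * yield);
        printf("%3.0f mm^2 die: ~%4.0f dies/wafer, ~$%.2f per good die\n",
               areas[i], dies, cost);
    }
    return 0;
}

So extra cores mostly show up as extra die area (plus, as mentioned above, ARM's per-core royalties), rather than as a separate per-core manufacturing charge.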
 