Indeed, and with more RAM comes more data, more transactions, and more bits flipping. "Very small in the grand scheme of things" is particularly apt here. Your experience isn't in these kinds of devices based on what you've said previously. The entire iPad consumes less power than just the N270 CPU in the JooJoo, for example. A milliwatt here and there is big.

From what I can understand from my fast-talking lecturer, the maximum number of bits flipping per cycle is limited by the bus width. So a 32-bit wide bus can only flip 32 bits per cycle; it doesn't matter how large the RAM is. The only case where what you're saying might be true is nVidia's 8800 GTS, which had 320MB of RAM and a 320-bit wide bus.
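To put the bus-width point in numbers: the bits moved per cycle is set by the bus width, not the total amount of RAM, and peak bandwidth is just width times clock. A minimal sketch, where the 200 MHz clock is a made-up example value, not an actual iPad spec:

```python
# Bits per cycle depends only on bus width, not RAM size.
bus_width_bits = 32
clock_hz = 200_000_000   # hypothetical 200 MHz memory clock (assumed)

bits_per_cycle = bus_width_bits                      # 32 bits, not 32MB
bandwidth_mb_s = bus_width_bits * clock_hz / 8 / 1_000_000

print(f"{bits_per_cycle} bits per cycle, {bandwidth_mb_s:.0f} MB/s peak")
```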
 
And when you're through with it, the trash can is over there, full of Jobs' obsolete iCrap.

Like iPod?
iPhone?
iMac?

:D

Looking in that rubbish bin, all I see is hockey-puck mice and the odd Apple TV. The rest all seem to be lying on competitors' benches, with people wondering how to make something cooler or easier to use.
 
From what I can understand from my fast-talking lecturer, the maximum number of bits flipping per cycle is limited by the bus width. So a 32-bit wide bus can only flip 32 bits per cycle; it doesn't matter how large the RAM is. The only case where what you're saying might be true is nVidia's 8800 GTS, which had 320MB of RAM and a 320-bit wide bus.

This is correct. His argument seems to be that with more RAM you will switch those bits during a higher percentage of cycles (because, of course, the point of RAM is to use it). But, of course, more RAM just means you are using less Flash memory (or crashing your apps less :)
 
The "constant" power you refer to is for refresh, and it is tiny compared to the read/write power, and probably less than 0.1% of the overall system power.
You're continuing to prove the point. 0.1% is a significant amount of power. You're still using a much larger scale mindset.
This argument makes no sense at all. If your working set is bigger than 256 MB and your program functions, it means you are manually paging to flash RAM (or reading static pages back from flash RAM), which burns MUCH more power than your DRAM power.
You're assuming that this occurs at all on a regular basis (or that it's even enabled in the OS). Paging and swap usage by applications requires a jailbreak on the iPhone.
"Very small in the grand scheme of things" means that the increased power consumption is less than the variation in power consumption caused by varying CPU workload, and thus it has no implications on the overall system design.
Again, that variation not being significant is based on a much larger perspective.
No, a mW here and there is not big. Burning one more mW has no effect on cooling, and reduces battery capacity by 30 seconds.
Again, that is still significant. 30 seconds here and there, a milliwatt here and there, is exactly how you can get an overall device consumption of 2.48W and a battery life of 10+ hours.
No they are not. Being in the same package does NOT make them on the same SoC. A SoC is a system on a CHIP. There are THREE chips in the package: two RAM chips and a single CPU chip that comprises the CPU, GPU, and assorted logic blocks.
Okay, but this doesn't affect the overall point. I should have said package instead of SoC.
My point was not that this is a practical exercise, however. It was that the CPU chip is already capable of driving 512MB of memory (indeed, much more than that), and there is no change required to the CPU chip to accomplish that. (And, again, by "CPU chip" I mean the one chip of the three chips that is not off-the-shelf Samsung DRAM).
Which, again, is not in contradiction with my point.

256MB enabled them to hit all performance targets and necessary tolerances. 512MB would have increased power consumption at idle by nearly 100% for the RAM. It is my understanding from leafing through some Samsung literature from 2008 that each 1Gb DRAM (DDR-S mobile) module consumes approximately 55mW in operation (reads at 80, writes at 74, refresh at 5-65). That's 110mW, or about 20% of the A4 package. That is far from negligible.
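The arithmetic above can be checked back-of-envelope. A minimal sketch, assuming the 55mW-per-module figure from the Samsung literature cited and an inferred ~550mW A4 package draw (chosen so the RAM works out to the ~20% share claimed; it is not an official spec):

```python
# Back-of-envelope check of the RAM power-share figures above.
per_module_mw = 55     # operating power per 1Gb mobile DDR module (cited)
modules = 2            # two 1Gb modules = 256MB
a4_package_mw = 550    # inferred package draw, not an official figure

ram_mw = per_module_mw * modules      # 110 mW total for RAM
share = ram_mw / a4_package_mw        # fraction of the A4 package

print(f"RAM draw: {ram_mw} mW, ~{share:.0%} of the A4 package")
```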

No one is saying that 512MB is some far-fetched pipe dream, but only that serious considerations other than saving an amount of money ($11 vs. $16) that amounts to nothing more than a rounding error in the retail unit price are responsible for the decision.
 
You're continuing to prove the point. 0.1% is a significant amount of power. You're still using a much larger scale mindset.

Sigh. No it's not. It's a percentage. It's not an absolute. 0.1% is 0.1%. It's a few minutes of run time. As I said, the CPU power consumption fluctuates by more than that. Don't presume to tell me my mindset. It's an engineering mindset.

You're assuming that this occurs at all on a regular basis (or that it's even enabled in the OS). Paging and swap usage by applications requires a jailbreak on the iPhone.

No it doesn't. I am an app developer (among other things). I load data into memory from flash and free it when the OS sends me a "memory is full" message. If I receive that message less often, I load things from flash less often.

I'll ignore the rest of your message since it's more of the same.
 
Any modern operating system disagrees with you.

Not even that: Why would you restart the iPad if an app is _not_ properly coded? It's just the same as with MacOS X on a Macintosh. If an application crashes, it's gone. All the memory it has eaten up is free again. All files it opened are closed. All sockets it opened are closed. The operating system will be working just fine. On the iPad the situation is even better, because there is only one application running. So when it exits, there is the operating system running, still in perfect shape, and nothing else.

Question: how does the OS handle this? Is it still possible for a crashing application to leave a resource locked? Furthermore, what if an application doesn't crash (or exit) but instead hangs, consuming system resources to the point of crashing the OS altogether?
 
Specs are irrelevant. It's how it performs that matters, and from the reviews so far it performs really well.


Right. Sort of like if you went to buy a TV and it cost the same amount as the other TV sets but only got 15 channels. Period. And the manufacturer promised more channels eventually but can't say when.

But you argue that a TV is a TV, and they argue "well, this TV promises the best experience because we hand-pick the programming," so it's justified that it doesn't have the same as these other TVs.

You would never ever buy that TV.
 
Right. Sort of like if you went to buy a TV and it cost the same amount as the other TV sets but only got 15 channels. Period. And the manufacturer promised more channels eventually but can't say when.

But you argue that a TV is a TV, and they argue "well, this TV promises the best experience because we hand-pick the programming," so it's justified that it doesn't have the same as these other TVs.

You would never ever buy that TV.

An excellent analogy for the pitfalls of the walled garden model.
 
Sigh. No it's not. It's a percentage. It's not an absolute. 0.1% is 0.1%.
For one mW, yes. But we're not talking about a single milliwatt difference here. Being generous to the advantages of scale, let's say the 110mW operating consumption of 2Gb will only increase by 50mW when doubling to 4Gb. In any other context, that would be a trivial difference.

But here, that increases the RAM share by 10% on the A4, and thus an approximately 2.5% difference in overall battery life. 15 minutes.
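The battery-life arithmetic above can be sketched explicitly. This assumes a ~2W average system draw and a 10-hour runtime, both round numbers inferred from the thread rather than measured values:

```python
# Rough battery-life impact of an extra 50mW of RAM draw.
extra_mw = 50
system_mw = 2000        # assumed ~2W average system draw
battery_min = 10 * 60   # assumed 10-hour battery life

fraction = extra_mw / system_mw      # share of the power budget
lost_min = battery_min * fraction    # runtime lost, in minutes

print(f"{fraction:.1%} of the budget, ~{lost_min:.0f} minutes of runtime")
```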
As I said, the CPU power consumption fluctuates by more than that.
...which is built directly into the power consumption averages specified by manufacturers. Notice the RAM figures--operating consumption 55mW; peak consumption 85mW. As this illustrates, power consumption at idle is more important to battery life figures, since all electronics spend most of their time waiting. Since the idle power consumption of RAM is strongly correlated with the amount of RAM, this is important.
Don't presume to tell me my mindset. It's an engineering mindset.
At the moment, it's just an absurdist mindset, perpetuating an argument that does not contradict any established point.

Your "engineering" mindset ignores the reality of engineering on this scale. You have presented exactly nothing to refute the uncontroversial point that considerations other than component cost (again, a price difference near as makes no difference in the final product) were involved in this decision, and that Apple's A4 customizations shaved off amounts of power from the ARM core less than the difference of 1Gb vs. 2Gb RAM modules. If RAM power consumption, inter alia, were insignificant, then so too would be Apple's efforts, since the stock Cortex A8 SoC power consumption is only about double the RAM.

Surely you're bright enough to understand that.
I'll ignore the rest of your message since it's more of the same.
Convenient.
 
For one mW, yes. But we're not talking about a single milliwatt difference here. Being generous to the advantages of scale, let's say the 110mW operating consumption of 2Gb will only increase by 50mW when doubling to 4Gb. In any other context, that would be a trivial difference.

But here, that increases the RAM share by 10% on the A4, and thus an approximately 2.5% difference in overall battery life. 15 minutes.

15 minutes in a system where the battery lasts >11 hours is meaningless. And that's assuming your bogus numbers. In reality, the refresh power is around 1mW, which, to be charitable to your argument, doubles to 2mW. The power for read/write does not increase at all, so it stays at 110mW. (Doubling the size of memory has no effect on the power used to read and write memory. All it affects is leakage current and refresh power.) Indeed, although Samsung appears not to have datasheets available (?), Micron has datasheets for what appears to be equivalent LPDRAM that show standby current increasing from 600uA to 660uA when doubling the size from 1Gb to 2Gb. That's a 10% increase in standby current, which is itself a tiny fraction of the read/write current. That's nothing.
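The standby-current comparison above converts to power as follows. A minimal sketch, taking the Micron current figures as quoted and assuming a typical 1.8V LPDRAM supply (the voltage is an assumption, not from the datasheet quote):

```python
# Standby current comparison, converted to power.
standby_1gb_ua = 600    # 1Gb part standby current (quoted)
standby_2gb_ua = 660    # 2Gb part standby current (quoted)
vdd = 1.8               # typical LPDRAM supply voltage (assumed)

increase = (standby_2gb_ua - standby_1gb_ua) / standby_1gb_ua  # +10%
standby_2gb_mw = standby_2gb_ua * vdd / 1000                   # ~1.2 mW

print(f"+{increase:.0%} standby current, ~{standby_2gb_mw:.2f} mW at {vdd}V")
```

Even at the larger capacity, standby draw stays near a milliwatt, an order of magnitude below the read/write figures being debated.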

Which is built into the power consumption averages specified by manufacturers. The CPU's power consumption at idle is more important to battery life, since all electronics spend most of their time waiting.

So? The point is that how I choose to use the device has far greater an effect on power consumption than the tiny change in refresh power. And the CPU die, itself, burns exactly the same amount of power regardless of how much memory is hooked to it.

I never said they did it to save a few bucks, but I dispute strenuously the idea that they had an engineering reason. More likely it had to do with chip supplies, the ability to get die in quantity, and the marketing necessity to have a higher-end version in the future.

Edit: by the way, I don't see the internet abuzz about how the 64GB iPad has horrible battery life compared to the 16GB iPad. Why? Because any difference is within the margin of error. There is no such effect. And there is no magical "cliff" in the power vs. memory function between 64GB and 128GB. There is no additional engineering complexity (beyond the obvious addition of an address bit connection). And there will be none until they use up all 32 address bits.
 
I think that this guy makes a good point by comparing it to other devices that focus on one application at a time, like the PS3 or Xbox. It makes much more sense in terms of the performance that they get when you think of it that way.
 
I think that this guy makes a good point by comparing it to other devices that focus on one application at a time, like the PS3 or Xbox. It makes much more sense in terms of the performance that they get when you think of it that way.

FTA

The iPad runs a stripped down version of OS X

No, it's iPhone OS, the exact same thing that runs on iPhone, just scaled up. FAR from a stripped-down version of OS X. Stopped reading after that.
 
15 minutes in a system where the battery lasts >11 hours is meaningless.
15 minutes is based on a 10 hour battery life (at 11 hours, it would be closer to 18 minutes), and I'm sorry, but I can't agree that 15-18 minutes is meaningless.
And that's assuming your bogus numbers. In reality, the refresh power is around 1mW, which, to be charitable to your argument, doubles to 2mW.
Standby current is not operating consumption, and I note from Micron's own data sheets that their read/write current is far higher than the Samsung, at 115mW.
Micron has datasheets for what appears to be equivalent LPDRAM that shows standby current increases from 600uA to 660uA when doubling the size from 1Gb to 2Gb. 10% of the standby current, which is a tiny fraction of the read/write current. That's nothing.
You are using DPD figures--i.e. no data retention--the pinnacle of dishonesty. I'm seeing idle refresh consumption at 4mW to 70mW, very much in line with the Samsung data I have previously posted.
So? The point is that how I choose to use the device has far greater an effect on power consumption than the tiny change in refresh power.
No one is arguing that point.
I never said they did it save a few bucks, but I dispute strenuously the idea that they had an engineering reason.
And I dispute strenuously your foreclosing of that notion. Engineering considerations clearly matter end-to-end here, and though supplier availability and production concerns also undoubtedly played a role, I can't seriously believe that Apple would have gone to the trouble to shave down the Cortex A8 while ignoring completely the impact of RAM selection (especially since saving just a hypothetical 10% on the A8 would amount to less than 30mW, something that clearly had value to Apple, and I doubt they had the resources or ability to make much deeper cuts than that).

Edit:
Edit: by the way, I don't see the internet abuzz about how the 64GB iPad has horrible battery life compared to the 16GB iPad. Why? Because any difference is within the margin of error.
More dishonesty. NAND flash and RAM are not comparable--RAM is in constant use, but NAND is read and written only as needed and quite infrequently in the grand scheme of things. NAND is also non-volatile--it does not require active power like RAM to maintain its state. All iPads have two NAND chips, presumably with comparable read/write power requirements, and thus comparable operating power consumption. An iPad constantly seeking from NAND would and does have considerably shorter battery life, regardless of capacity. In fact, along with keeping the wireless active, the constant NAND access is a major reason why an iPad pushed to its limits only lasts about 6 hours.
 
15 minutes is based on a 10 hour battery life (at 11 hours, it would be closer to 18 minutes), and I'm sorry, but I can't agree that 15-18 minutes is meaningless.

Standby current is not operating consumption, and I note from Micron's own data sheets that their read/write current is far higher than the Samsung, at 115mW.

You are using DPD figures--i.e. no data retention. The pinnacle of dishonesty. I'm seeing idle refresh consumption at 4mW to 70mW, very much in line with the Samsung data I have previously posted.

No one is arguing that point.

And I dispute strenuously your foreclosing of that notion. Engineering considerations clearly matter end-to-end here, and though supplier availability and production concerns also undoubtedly played a role, I can't seriously believe that Apple would have gone to the trouble to shave down the Cortex A8 while ignoring completely the impact of RAM selection.

Fact not in evidence, counselor. They did some sort of design work on what appears to be some SoC based on the Cortex A8 block, but there is no actual evidence yet that anything has been "shaved down."
 
FTA



No, it's iPhone OS, the exact same thing that runs on iPhone, just scaled up.. FAR from a stripped down version of OS X. Stopped reading after that

iPhone OS is a stripped down version of OS X. I'd suggest giving it one more go before writing it off.
 
Well, you say you knew some people at PA Semi. What do you think they would've done?

In the time they had? Not a heck of a lot. Certainly floorplanning the SoC and packaging. Probably they designed their own I/O cells and maybe PLLs. They probably did their own clock network and designed the clock drivers. They may have done physical design (place & route, standard cell selection - no synthesis), but probably not the entire chip (just not enough time). They may have designed some of the interface blocks, but I'm guessing not the memory controller. It's possible, but unlikely, they designed the SRAM cells for at least the cache.

Next time around you can bet they will be doing the entire physical design except for low speed blocks.
 
and that Apple's A4 customizations shaved off amounts of power from the ARM core less than the difference of 1Gb vs. 2Gb RAM modules. If RAM power consumption, inter alia, were insignificant, then so too would be Apple's efforts, since the stock Cortex A8 SoC power consumption is only about double the RAM.

Apple is building a portfolio of methods and suppliers. They are testing limits on this limited capability device. They will take these elements and make incrementally better devices each iteration. That's what they do. And as for the argument of engineer, physicist, etc, you all miss the point. This is driven by marketing and margins. The rest is a means to an end.

You don't like that, but that's why this post is a monologue.

Rocketman
 
Fact not in evidence, counselor. They did some sort of design work on what appears to be some SoC based on the Cortex A8 block, but there is no actual evidence yet that anything has been "shaved down."
If you want to be technical about it, there's no actual evidence that they're 2Gb RAM modules or that it is definitively Cortex A8-based--the part numbers don't exactly match anything in stock. They're most similar to 2Gb shipping units and the A8, but it hasn't been proven. It's all just informed supposition and hearsay.
And as for the argument of engineer, physicist, etc, you all miss the point. This is driven by marketing and margins. The rest is a means to an end.
On the contrary, that is exactly my point. The difference in margins between 2Gb and 4Gb modules, cmaier and I both essentially agree, is minor. The marketing concerns regarding 256MB vs. 512MB are a wash, leaning perhaps slightly to a disadvantage by going with the smaller amount (quantity vs. efficiency, etc.).

That's precisely why there are technical and/or logistical issues that are relevant here.
 
Looking in that rubbish bin and all i see is hockey puck mice and the odd apple tv.

You're not looking very deeply. The thing is already nearly full with first-generation iPods. The cheaper something costs, the sooner it is thrown away to be replaced with the next "cooler" version.

The first load of iPhones is merely months away.

Not that many iMacs in there though that have DVD burners, even the oldest ones.

:apple:
 
You're not looking very deeply. The thing is already near full with first-generation iPods. The cheaper something costs, the sooner it is thrown away to be replaced with the next "cooler" version.

The first load of iPhones is merely months away.

Who is going to throw away a working iPhone?

Just because something new comes out doesn't mean the old ones are useless. The second-hand market is quite robust.

Can't be bothered to work out the residuals - but I'll bet there aren't many 3-year-old phones that still have a commercial value.

Same with iPods.
 
Just because something new comes out doesn't mean the old ones are useless, The second hand market is quite robust.

So true, I've had every version of the iPhone, all are still being used by someone in my extended family.

Just because I want the latest tech doesn't mean everyone has to have it.
 
I like apple products because they age well.

The new generations take the glory - but the older ones still do the job they were designed to do and still look better than most of the other products of a similar age.
 