ECC RAM isn't even that expensive

It's not just the added price of the RAM - it is the need for a "workstation" chipset that supports ECC, and a Xeon CPU that supports ECC.

And if you don't need ECC, that's money down the drain straight into Apple's obscene profit margin.



The graphics card options are silly too; a GeForce GT 120 is an absolutely terrible card based on the GeForce 9500 GT (for gaming, at least).

Yes, the Mac Pro graphics options are craptacular.
 
And more expensive doesn't mean better.

I never claimed it did.

But you can easily beat a $3300 quad Mac Pro for $1700.

Apple's custom upgrade prices suck. That's why I said "stock". I don't see how comparing a completely different system to a Dell refutes what I said.

Apple needs a desktop system with desktop parts. There's such a huge gap between the all-in-one laptop-on-a-stand and the professional workstation.

Or... that's what some people want.

Fine, here is a system that exceeds the Mac...

All for a price of $2231 if you account for the slight bump in processor cost.

This system EXCEEDS the $3300 Mac Pro system; in fact, if you try to achieve price parity you end up bringing the Mac to nearly $3900 by adding a Radeon 4870 and a 2nd 1TB HD.

And amazingly, not less than $2000. Just like I claimed. I'm not trying to say Mac Pros are not significantly more expensive than a PC with a comparable processor, GPU, and whatever. I simply made one specific claim.

So again why does Apple charge nearly 50% more for their system??

Because people are willing to pay the price for the considerations other than speed.

I like Apple's mobile products and I like the current iMacs (although those could use a drop in price). But charging 50%+ more for a system is insane.

That's why they market them as Pro workstations.

And, in general, most stock Macs are a very good deal when they are initially released. Because Apple does not adjust prices as component costs decrease, the value of the Mac decreases significantly over its lifespan.

The problem with the "all the specs" requirement is that it forces someone to include things that may not be important to that person. A lot of people would like a quad core desktop - but they don't need a Xeon workstation and its price.

Apple chooses not to market to everyone. It's all about trade-offs.

The only significant spec that the Dell that's half the price doesn't meet is ECC memory.

If ECC is a personal requirement - fine. But, since the overwhelming majority of Apple systems sold don't have ECC - one could assume that perhaps ECC isn't a requirement for most people.

Again, your comparison had nothing to do with my claim.

Yes, silly you. A couple of minor options put the quad Mac Pro well above $3K. Look at my post a couple above - the quad Mac Pro is $3.3K and I didn't even try.

Funny thing about assumptions. Not everyone is thinking the same as you.
 
The Mac Pro prices are reasonable and even good compared to other Xeon-based workstation systems.

The problem is the

. . . . G I A N T . . . G A P I N G . . . H O L E . . .

between the Imac and the Mac Pro - where other companies have many quad core desktop systems, including Core i7 quads that are much faster than the quad Mac Pro for less than half the price.

That HAS always bothered me. Why do they basically refuse to make a consumer tower and opt only for a slower mobile proc in the iMac? I long for the days of the desktop iMac to return. I don't care if it basically looks like a monitor with one cable; I want an upgradeable and, above all, AFFORDABLE Apple desktop. The folks that were selling those "fake" Macs a while back, before Apple sued them, were on the right track. Hopefully Apple sees the logic in creating an affordable desktop system and will reveal it in the next keynote presentation.
 
I never claimed it did.

Your statement clearly implied this. You don't say "Value doesn't mean cheaper" except to imply that a more expensive item is better.


Apple's custom upgrade prices suck. That's why I said "stock". I don't see how comparing a completely different system to a Dell refutes what I said.

I upgraded the CPU by one speed notch, bumped the disk on both to 1 TB, and made the Apple warranty match the 3 year Dell warranty.

If that makes it a "completely different system" - well, just wow.

Now, if I don't try to meet the "$3K Mac Pro" requirement, I get

Mac Pro quad 2.66 GHz $2,499
Dell Studio XPS quad 2.66 GHz $799

and the Apple is more than three times the price of the Dell, which is of equivalent value in most tangible aspects.

What's wrong now?
 
Your statement clearly implied this. You don't say "Value doesn't mean cheaper" except to imply that a more expensive item is better.

You inferred something that wasn't implied because you think I'm trying to prove something that I am not trying to prove.

Of course if two things are otherwise equal, the cheaper item is a better value. The problem is that no matter how much you try, you can't get a Windows PC and a Mac PC to be exactly the same. Because one runs Windows and one runs OS X. The value is dependent on how much you value the things that are different.

I upgraded the CPU by one speed notch, bumped the disk on both to 1 TB, and made the Apple warranty match the 3 year Dell warranty.

If that makes it a "completely different system" - well, just wow.

Well, my claim was based on a stock 8-core Mac Pro. So, a customized Quad-core is a completely different system.

Now, if I don't try to meet the "$3K Mac Pro" requirement, I get

Mac Pro quad 2.66 GHz $2,499
Dell Studio XPS quad 2.66 GHz $799

and the Apple is more than three times the price of the Dell, which is of equivalent value in most tangible aspects.

What's wrong now?

Nothing is wrong. If you don't value the things that make the Mac Pro different, then you could save a lot of money with a Dell.

Like I said, you seem to think I am trying to prove something that I'm not.
 
That HAS always bothered me. Why do they basically refuse to make a consumer tower and opt only for a slower mobile proc in the iMac?
Apple won't make a mid-range tower because they know it will slaughter their Mac Pro sales. Most people buying Mac Pros aren't doing it because they need a fire-breathing 8-core Xeon with 16GB of RAM, four internal drives and an array of high performance video cards. They're doing it because they need a bit more performance than an iMac, or a machine that doesn't have a screen welded to it and isn't neutered.

Eg: the company I work for buys dozens of Mac Pros for the single reason that we need to attach two (or three in some configurations) specific types ("medically certified" and capable of portrait view) of LCD screens. We'd have no need to buy one again if the mythical "xMac" were ever released.
 
Apple won't make a mid-range tower because they know it will slaughter their Mac Pro sales.

...and decimate Imac sales.

But, even if far fewer Mac Pros are sold, and somewhat fewer Imacs, market share will go up.

Oh well, The Steve doesn't like to give his customers what they think they want - he knows better.
 
You're still stuck in the past grave you dug yourself.

• Implying that the GTX 285 is the most prevalent "gamer" card.

I've invited more realistic suggestions, but the response has been ... silence.

• Implying that the majority of desktop users aren't going to be using integrated graphics, and are instead using said GTX 285.

Not quite. Items such as this are a risk factor that applies to the population, which results in an increase to the net aggregate average.

For example, if we use the 50W standby (for lack of anything better), and if we say 20% of the PC population eventually adopts this sort of option, then there's an increase to the net aggregate PC figure, which will of course be pro-rated: e.g., (50W) * (20% sub-population) = +10W.

The basis for differentiating is that the PC's expandable case is an enabler, whereas the "non-expandable" iMac form factor isn't. This is why it's expressed as a risk and pro-rated, instead of added in as a standard characteristic.
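
As a concrete sketch of that pro-rating (a minimal illustration in Python; the 50W figure, the 20% adoption rate, and the 200W baseline are the same illustrative assumptions as above, not measurements):

    # Sketch: pro-rate a subpopulation's extra draw into a net aggregate average.
    # All figures are illustrative assumptions carried over from the example above.
    def aggregate_watts(baseline_w, risk_factors):
        # baseline_w: assumed average draw of the plain system, in watts
        # risk_factors: list of (extra_watts, fraction_of_population) pairs
        return baseline_w + sum(extra * frac for extra, frac in risk_factors)

    baseline = 200.0                 # assumed average PC draw (W)
    risks = [(50.0, 0.20)]           # e.g. a 50W option adopted by 20% of the population
    print(aggregate_watts(baseline, risks))   # 200 + 50*0.20 = 210.0 W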


• The original outrageous cost of ownership based on power consumption of the iMac vs. a PC desktop

Sorry, but one either gets to disagree with the inputs, or the output, not both.

• Using the peak wattage on a power supply as the measure of power consumption.
I agree that it has its limitations, not the least of which is actual duty cycle. However, this was clearly identified as merely a ROM (rough order of magnitude), and there have been several factors mentioned that offset its known high duty cycle.

I am simply not interested in doing a 90 day test trial of n=2000 users distributed across the identifiable demographic subsets in order to get truly "accurate" data.

If you want to, then be my guest.
Or if you can point to a vetted study that did this, I'll read it.

You're not going to be sporting a G92 much less a GTX 285 on a 305W power supply anyways. Apparently our power user ways somehow apply to every PC owner now.

This point has also been already addressed: it is the difficulty in defining what is "average" for the consumer PC. This means that we have a plethora of people doing lots of different things ... including the risk of 20% (or whatever) of the population that are gamers ... that we have to figure out a reasonable net sum average value.

Thus, while every 20" iMac comes with the same sized PSU, over on the PCs they vary widely, from 250W to over 1000W.

The PC share is a heterogeneous mess that we're trying to KISS down to a simple use case...a net aggregate. So while it may be 100% correct to say that the "average user" is Grandma with a Celeron and a 250W PSU, that's also incomplete, because Grandma's uber-enthusiast gamer grandson also exists: his box has an 800W PSU, and between grids and gaming he's topped it out and is burning far more power.

We're stuck, trying to figure out a guess on the aggregate midpoint.

The statistical distribution is a one-sided tail...and the question is how much weight is in the tail? I don't know, you don't know, and neither does anyone else here, since that would only be resolved through data collected from something on the scale of a 90-day, n=2000 group study.


Let's not even get into energy cost savings due to performance scaling of cores and architectures. Not to mention the Core i7 towers running around with 350 - 400W PSUs that can give the single socket Mac Pro a run for its money.

Agreed. The i7 is clearly a very nice chip, but that's not germane to my point. The point is power consumption, not performance, and it's easy to see that the relative scale of the PSUs shipping with the i7 is not dramatically different from last year's status quo, so macroscopically, the i7 has arguably not caused a huge change.


But the Mac Pro has more/better expansion? Guess what: the power supply is scaled to that.

And because of that, the Mac Pro has the same upside growth risk as a PC tower.

You've also ignored Aiden's power consumption calculator which can even handle idle productivity, full load, and sleep states in its calculations.

As you know, Aiden is in my killfile.
Because you're not, I went back to find the link.

I see that Dell's calculator's default assumptions include the cost of electrical power at 10% below the US national average. The model has no provisions for lifecycle cost estimations. Its default also assumes that one will only be really pushing the computer for 5 hours/week (250 hours/year). Its definitions are vague. And the tool for setting the "hours per day external display is used" option is ... broken?

It is certainly possible to work through all of these shortcomings to get a SWAG, but personally, I'd not lend a great deal of confidence to it. I'd first go find another tool to cross-compare...such as APS's website for sizing a UPS.


(RE: DISPLAY) No one is going to disagree with you there. The display does have to be included. No reason to even think we're going to somehow come up with some outrageous reason not to include it when the iMac is permanently stuck to its display.

Thanks. Unfortunately, I've not seen where some posters who are throwing out various numbers have actually clearly documented that.


I was the original poster, and since my original point was obviously a bit too subtle, I'll state it more plainly.

It is highly unlikely that, even in "peak time", the average PC will draw more than 100W.

But where did you document that this explicitly included a monitor? And did it stipulate what size or which brand of monitor?

After all, a 20" LCD alone draws 40-45W when on, so by simple subtraction this only leaves ~60W for the rest of the PC.


All my numbers are for whole systems, as are the other examples I have seen. You are the only person who seems to think screen power usage is not being included.

My apologies if I accidentally overlooked where you CLEARLY documented this. Perhaps you could be kind enough to provide the cite to the exact Post# where you clearly said this?

Or at least now take a minute to comprehensively detail out what your model is?

If you have actual data to suggest otherwise - and not wild theories based on bad assumptions - please share it.

Not to be rude, but "You First".

You've taken a lot of swings at my simple ROM, but at least I've been open and clearly laid my cards on the table to allow it to be critiqued.


-hh
 
As you know, Aiden is in my killfile.
Because you're not, I went back to find the link.

I see that Dell's calculator's default assumptions include the cost of electrical power at 10% below the US national average.

The price per kilowatt hour is a text box - you can enter whatever the value is for you. (I've been using the same $0.15 as -hh)


The model has no provisions for lifecycle cost estimations.

That's probably why it's called an energy savings calculator.


Its default also assumes that one will only be really pushing the computer for 5 hours/week (250 hours/year).

You misread that - it says "250 work days/year", and 8 hour work days. And again, those are the defaults which can be modified.

If you want, you can set it to "max performance 24/7" - and it still comes nowhere close to your figures.


Its definitions are vague.

The white paper linked from the calculator is anything but vague.


And the tool for setting the "hours per day external display is used" option is ... broken?

"External display" is only if you select a laptop system. If you select a desktop, it figures the monitor usage based on the profile.


It is certainly possible to work through all of these shortcomings to get a SWAG, but personally, I'd not lend a great deal of confidence to it.

It is far more detailed, flexible, and accurate than your power supply * 24/7 WAG (you don't get the "S", because your estimate is not).


I'd first go find another tool to cross-compare...such as APS's website for sizing a UPS.

LOL. Instead of Dell's measured power consumption and calculator you'd go to a site with a vested interest in overestimating peak power needs and ignoring idle and sleep time?
 
Aiden, you're on his ignore list; he can't read that. But I am sure that he is missing out.
 
250 watts for peak spikes

For interest's sake, can you load up Prime95 or something similar, and see what the power draw is with the CPU maxed out ?

Bonus points for throwing in a 3DMark run at the same time to load up the GPU. That should give pretty close to a worst-case power draw scenario for that system.

Dell Studio XPS, Core i7-940 (2.93 GHz), 12 GiB RAM, 1 disk, wifi card, ATI RADEON HD4670 512MB. Monitor not included.

Idle - 89-93 watts, no gadgets, no activity

Web surfing, downloading Prime95,... 105-110 watts

Running 4 threads of Prime95, 215 to 230 watts, saw occasional peaks in the high 240s.

An OpenGL benchmark used about 30 watts above idle on its own, and added about 10 watts on top of Prime95.

So, I couldn't get the average above about 230 watts, and spikes never went above 255.


Aiden, you're on his ignore list; he can't read that. But I am sure that he is missing out.

I don't think that -hh is reading what anyone is saying here....
 
After reading -hh's replies I'm beginning to wonder.

Aiden what video card do you have on that Studio XPS?

ATI RADEON HD4670 512MB

It has truly abysmal OpenGL performance - I should check that it is running the accelerated OpenGL hardware drivers and not Microsoft's Software OpenGL stack.
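
For what it's worth, one quick way to check is to read the renderer strings off a live GL context. A minimal sketch in Python, assuming PyOpenGL and glfw are installed; if the vendor/renderer come back as "Microsoft Corporation" / "GDI Generic", the unaccelerated software stack is in use:

    # Sketch: query the active OpenGL implementation (assumes PyOpenGL + glfw installed).
    import glfw
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION

    glfw.init()
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)        # hidden window, just to get a context
    win = glfw.create_window(64, 64, "gl-probe", None, None)
    glfw.make_context_current(win)

    # "Microsoft Corporation" / "GDI Generic" = software stack; an ATI/AMD string = hardware driver.
    print("Vendor:  ", glGetString(GL_VENDOR).decode())
    print("Renderer:", glGetString(GL_RENDERER).decode())
    print("Version: ", glGetString(GL_VERSION).decode())

    glfw.terminate()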
 
ATI RADEON HD4670 512MB

It has truly abysmal OpenGL performance - I should check that it is running the accelerated OpenGL hardware drivers and not Microsoft's Software OpenGL stack.
I suspected as much. In most cases you're not going to break 80% continuous draw on an OEM power supply. That's just an offhand guess that will work for most cases.
 
Aiden, you're on his ignore list; he can't read that. But I am sure that he is missing out.

Or perhaps you have yet to learn, before you vouch for someone else.

For example, other posters have already agreed that it is an unfair comparison to not include the display on the PC.

So please, let's see you provide the citation to where he has previously documented clearly what size external display he has included in his various systems' allegedly 'detailed' power consumption claim(s).


Good luck.


-hh
 
I suspected as much. In most cases you're not going to break 80% continuous draw on an OEM power supply. That's just an offhand guess that will work for most cases.

The measurements are *input* power - the total amount drawn from the wall.

The power supply rating (e.g. "360 watt" for the XPS) is *output* power, the DC power available to the computer.

With typical 80% to 90% efficiencies, it's likely that the average power used for the electronics never exceeded 200 watts. (That is, about 56% of the rated capacity.)

Of course, the input power is what determines what you pay the power company, so that's the important number when looking at cost of ownership.
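
Putting numbers on that (a minimal sketch; the 87% efficiency is an assumption within the 80-90% range above, and the 230 watts is the peak sustained wall draw measured earlier):

    # Sketch: relate measured wall (input) power to the DC (output) power the PSU delivers.
    def dc_output_watts(wall_watts, efficiency=0.87):   # efficiency is an assumed figure
        return wall_watts * efficiency

    psu_rating_w = 360.0              # rated DC output of the XPS supply
    measured_wall_w = 230.0           # peak sustained draw measured at the wall
    dc_w = dc_output_watts(measured_wall_w)
    print(f"{dc_w:.0f} W DC, {dc_w / psu_rating_w:.0%} of rated capacity")   # ~200 W, ~56%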
 
The measurements are *input* power - the total amount drawn from the wall.

The power supply rating (e.g. "360 watt" for the XPS) is *output* power, the DC power available to the computer.

With typical 80% to 90% efficiencies, it's likely that the average power used for the electronics never exceeded 200 watts. (That is, about 56% of the rated capacity.)

Of course, the input power is what determines what you pay the power company, so that's the important number when looking at cost of ownership.
I should have clarified. That's 80% continuous output. You can break that 80% with occasional peak instances. Of course the input is what you're going to pay and hopefully you're getting better than 80% efficiency in that conversion.
 
So please, let's see you provide the citation to where he has previously documented clearly what size external display he has included in his various systems' allegedly 'detailed' power consumption claim(s).

"It comes up with $36.77 per year for an Optiplex desktop with a 20" LCD panel."

https://forums.macrumors.com/posts/8158443/

The Dell calculator clearly includes the monitor, and I clearly stated which option I chose.

On the other hand, except for the Imac, it is useful to look at the system only. Since the monitor can be connected to different systems, if I use the same monitor with every system then the monitor disappears from the comparison.

I also don't think that anyone disagrees with the idea that an Imac is probably on the low side of system power consumption.

What we're disagreeing with is the idea that an Imac saves the typical user $250/year in electrical bills. That is an absurd claim.
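
For reference, the arithmetic behind any of these annual-cost figures is straightforward (a sketch; the $0.15/kWh rate and the 24/7 duty cycle match the figures used in this thread, and the wattage deltas are only illustrative):

    # Sketch: annual electricity cost of a constant extra draw at a 24/7 duty cycle.
    def annual_cost(watts, rate_per_kwh=0.15, hours_per_year=24 * 365):
        return watts / 1000.0 * hours_per_year * rate_per_kwh

    print(annual_cost(100))   # ~$131/yr for a constant 100 W difference
    print(annual_cost(200))   # ~$263/yr -- roughly the constant gap a $250/yr saving would require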


Of course the input is what you're going to pay and hopefully you're getting better than 80% efficiency in that conversion.

The XPS has a Delta DPS-360FB-1A power supply. While I didn't find exactly that model on Delta's site, they do claim 90% efficiency for their DPS line.
 
I've invited more realistic suggestions, but the response has been ... silence.
The response is largely irrelevant. Machines with "gamer" cards - whatever they may be - are atypical and therefore not a reasonable comparison to an iMac, in a discussion about the "average PC".

I agree that it has its limitations, not the least of which is actual duty cycle.

It doesn't have "limitations", it's wrong. The maximum rating of a PSU does not define how much power a given system typically uses.

I am simply not interested in doing a 90 day test trial of n=2000 users distributed across the identifiable demographic subsets in order to get truly "accurate" data.

You don't need to. You just need to let go of your ridiculous assumptions and work with rational ones. You could also try listening to the people who are posting numbers they have recorded both recently and in the past, and looking at tools like Dell's to estimate.

Has it not even occurred to you to rethink your assumptions when your numbers are barely even within the same order of magnitude as everyone else's ?

This point has also been already addressed: it is the difficulty in defining what is "average" for the consumer PC.
There's no difficulty. The "average" PC is going to be just like the "average" Mac because they have nearly identical components and are used in nearly identical ways. Unless you have a good rationale for why PC users are doing dramatically more with their computers ?

This means that we have a plethora of people doing lots of different things ... including the risk of 20% (or whatever) of the population that are gamers ... that we have to figure out a reasonable net sum average value.
If you are including gamers in your power comparisons on the PC side, why are you not including Mac Pro users in your comparison on the Mac side ?

Thus, while every 20" iMac comes with the same sized PSU, over on the PCs, they vary widely, from 250w to over 1000w.
Which doesn't matter, since the size of the PSU does not meaningfully impact how much power the system draws in normal use.

The PC share is a heterogeneous mess that we're trying to KISS down to a simple use case...a net aggregate. So while it may be 100% correct to say that the "average user" is Grandma with a Celeron and a 250W PSU, that's also incomplete, because Grandma's uber-enthusiast gamer grandson also exists: his box has an 800W PSU, and between grids and gaming he's topped it out and is burning far more power.
Then I await your updated calculations to allow for all those people who have maxed-out 8-core Mac Pros.

We're stuck, trying to figure out a guess on the aggregate midpoint.
No. You're stuck because you refuse to consider anything that might disagree with the ******** you've already posted. The rest of us have already identified actual measurements, reasonable estimations, and links to online calculators to validate our numbers and completely invalidate yours.

The statistical distribution is a one-sided tail...and the question is how much weight is in the tail? I don't know, you don't know, and neither does anyone else here, since that would only be resolved through data collected from something on the scale of a 90-day, n=2000 group study.
What rationale can you offer for your assumption that the average PC user is doing substantially more on their PCs, such that the average power draw of said machines is 4-5x that of an iMac, and for substantially longer, such that said power draw should be calculated on a 24/7 basis ?

What rationale can you offer for the distribution not being a good ol' bell curve?

I see that Dell's calculator's default assumptions include the cost of electrical power at 10% below the US national average. The model has no provisions for lifecycle cost estimations. Its default also assumes that one will only be really pushing the computer for 5 hours/week (250 hours/year). Its definitions are vague. And the tool for setting the "hours per day external display is used" option is ... broken?
I'm guessing you have no idea how hilarious it is to see you attacking Dell's tool for having bad assumptions and calculations.

But where did you document that this explicitly included a monitor? And did it stipulate what size or which brand of monitor?
I'm comparing to an iMac. The reasonable assumption - which is true - is that I would be picking a screen equivalent to the one in the typical iMac - 20".

After all, a 20" LCD alone draws 40-45W when on, so by simple subtraction this only leaves ~60W for the rest of the PC.
And....?

Or at least now take a minute to comprehensively detail out what your model is?
My "model" is actual measurements taken from hundreds of computers - from low-end desktops to 16-core servers - and knowledge of the components that go into a PC and how much power they draw.

Not to be rude, but "You First".
Numerous examples, and a link to at least one calculator have already been provided. If you bother to spend some time searching you'll find several power supply calculators on the web.

You've taken a lot of swings at my simple ROM, but at least I've been open and clearly laid my cards on the table to allow it to be critiqued.
And you have ignored extensive criticism of everything about it and why it is wrong, instead choosing to use the 'lalalalalala' approach and insist it's right.
 
The XPS has a Delta DPS-360FB-1A power supply. While I didn't find exactly that model on Delta's site, they do claim 90% efficiency for their DPS line.
That's pretty good. I hover between 82-86% efficiency on my PSUs and that is going to depend on the load. I'm nowhere near the maximum continuous output on them though.
 
...and decimate Imac sales.
I think iMac sales would fare much better. The iMac is a solid - if somewhat underspecced/overpriced - machine, and has carved out a good niche for itself. Ultimately, most people aren't all that interested in expandability.

In any event, I would expect any "xMac" to be priced such that the xMac+screen cost more than an iMac.
 
I think iMac sales would fare much better. The iMac is a solid - if somewhat underspecced/overpriced - machine, and has carved out a good niche for itself. Ultimately, most people aren't all that interested in expandability.

In any event, I would expect any "xMac" to be priced such that the xMac+screen cost more than an iMac.

I meant "decimate" literally - maybe 10% of Imac purchasers would opt for the Xmac. If Apple had reasonable monitor options, that would probably turn out to be more profit for Apple.

The current choice of a $900 monitor and an $1800 monitor isn't reasonable, IMO.
 
Or perhaps you have yet to learn, before you vouch for someone else.

For example, other posters have already agreed that it is an unfair comparison to not include the display on the PC.

So please, let's see you provide the citation to where he has previously documented clearly what size external display he has included in his various systems' allegedly 'detailed' power consumption claim(s).


Good luck.


-hh

Do you have an Apple rod up your arse? I made a general comment, I wasn't referring to any specific points he made, just that I know him to make a lot more sense than most of the posters here. So leave me out of it.
 
Has it not even occurred to you to rethink your assumptions when your numbers are barely even within the same order of magnitude as everyone else's ?

Sure...and when some of the assumptions are using a mere 3% utilization, would not going to even a 30% utilization very well be expected to result in a full order of magnitude of difference?

As I've previously said, the estimation of utilization is a YMMV factor, and in no small part due to my own interests (since it includes high utilization), I simply assumed 100%. I promptly made this assumption clear - wasn't trying to hide it at all (unlike the 3%).


What rationale can you offer for your assumption that the average PC user is doing substantially more on their PCs, such that the average power draw of said machines is 4-5x that of an iMac, and for substantially longer, such that said power draw should be calculated on a 24/7 basis ?

Breaking this multi-part question down:

  • The average PC user has a higher risk of 'more' because of the inherently greater expandability of the PC's tower configuration (see note).
  • Incorrect: I never claimed that the average power draw of said PC was 4-5x that of an iMac. Fact: it was 2x, based on 200W vs 400W, although 1.5x was also offered ($100/yr was based on 200W vs 300W).
  • Incorrect: I never claimed that the average utilization of said PC was substantially longer than that of an iMac. Fact: both were 24/7.

Note: Please keep in mind that we're trying to consider what may very well happen to a machine over its 5-7 year lifecycle. The iMac isn't particularly expandable, so it's a lot easier...but how many PC home users can really resist the temptation to upgrade? Given all of the upgrade parts that one finds in aisle after aisle after aisle in Best Buy, it's hard to claim that there's utterly no market for PC upgrades.


What rationale can you offer for the distribution not being a good ol' bell curve?

Chi Square distribution. Because unlike the SND (standard normal distribution), the Chi Square positively disallows the possibility of a negative power consumption.

Plus, note also that as the variance increases, it incurs an upward shift on that average.
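
A quick simulation illustrates the shape argument (a sketch only; the chi-square parameters are arbitrary and are not actual power data):

    # Sketch: for a right-skewed, strictly positive distribution the mean sits above the
    # median, and both shift upward as the spread (variance = 2k for chi-square) grows.
    import numpy as np

    rng = np.random.default_rng(0)
    for k in (3, 10, 30):                      # chi-square degrees of freedom (illustrative)
        draws = rng.chisquare(k, size=100_000)
        print(f"k={k:>2}: mean={draws.mean():6.2f}  median={np.median(draws):6.2f}")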


I'm comparing to an iMac. The reasonable assumption - which is true - is that I would be picking a screen equivalent to the one in the typical iMac - 20".

Yes, and I've simply been waiting for you to document it, just like that. Thank you.


My "model" is actual measurements taken from hundreds of computers - from low-end desktops to 16-core servers - and knowledge of the components that go into a PC and how much power they draw.

I appreciate the statement, but I don't recall seeing it before, nor its substantiation (did I miss it, or is this the first time it's been made?).

FWIW, what I'd find more interesting & relevant would be such values in conjunction with a statistical breakdown of the various "classes" of desktops: the heterogeneous variability factor that I've mentioned.


And you have ignored extensive criticism of everything about it and why it is wrong, instead choosing to use the 'lalalalalala' approach and insist it's right.

When the criticism is out-of-the gate abusively rude, the poster starts an inch from simply being reported to Moderators.

I do understand that we all have tempers, but repeated offensiveness is simply uncalled for, and quite a counterproductive way to get one's point across. Ongoing attempts to ridicule and intimidate are simply not acceptable in a civilized society.


-hh
 