Better support is a given, since building your own yields no support beyond what you yourself provide (and the warranties on each individual part), but how can you actually say that the parts are any better? Apple, just like every other PC company, generally goes for the best pricing per volume it can get. Now, does it typically go with more reputable companies than, say, the likes of Acer? Of course. But companies such as Western Digital, ASUS, Corsair, etc. are all very reputable (and are the ones he listed).

The only "red flag" to me on his list was the HIS graphics card, but even then that depends on whether the card is stock or not. If it's stock, it's build to the specifications issued by AMD and retailers like HIS simply slap a HIS sticker on it and push it out the door. Some companies, such as XFX, Sapphire, etc., push out custom cooler designs with variable clock rates, but yeah. And the minimum Radeon 58XX series warranty I've seen was for two years, so even if HIS has a warranty such as that, it's still twice the iMac's default warranty.


Sorry, but your Voodoo2 comment is false. The Voodoo2's big "feature" upon release was dual-texturing, something Quake II supported (and Quake II's release pre-dates 3Dfx's release of the Voodoo2 cards). Unreal and Starsiege: Tribes also supported it, if I recall, and both of those came out in 1998 as well, the same year the Voodoo2 did; I remember because people made a big deal over the "new-fangled" technology it incorporated, called SLI. Anyway, the Voodoo2 actually lasted a considerable amount of time performance-wise, thanks to Glide's dominance in the late 90s.

As for the Ti 4600, if I recall, its major "feature" was hardware AA, something that did take games a little while to implement. Even that notwithstanding, though, the Ti 4600 was the *top*-performing card for a fairly long time, and by a good margin. It was one of those rare cards where you could spend a hefty amount and not feel bad given how long it lasted (especially given the crap-shoot that was the FX line to follow). Anyone who complains about having owned a Ti 4600 really should just stop.


I'm sorry, but now I know you're just trolling.

I have a friend who built a PC a few years ago with one of those bleeding-edge cards that requires one or two dedicated power connections and has a huge fan. He said his electric bill went up $30 that month. I saw the latest Fermi-based card, and the sucker draws something like 250 W. It gets so hot you could probably heat your house with it, and the giant fans probably sound like airplanes flying overhead.

The Voodoo2 was nice, but hardly anyone did SLI back in the day, and Quake 2 ran just fine on one card. Wing Commander: Prophecy rocked on Glide compared to the nVidia Riva TNT2 I used as my main card.

I forget, but I think the Ti 4600 was the first to have hardware T&L on the card. Either way, with both the Voodoo2 and the Ti 4600, it was at least 12-18 months before I saw games that listed their features or video RAM as required or recommended.
These days we have YouTube, and people upload gameplay videos; before I chose my laptop with Intel HD graphics, I watched videos of people playing games that looked OK on it. I remember the FUD from Creative and 3Dfx when the Voodoo2 first came out about buying the 12MB version, or two of them to get SLI.

With iMacs, Apple does the little things, like very high-quality power supplies that are 87% efficient or higher, which cost a lot of money on Newegg and which Dell/HP don't use in their regular computers. They also use components with better capacitors. I remember a few years ago there were issues in the build-it community with motherboards dying because companies saved a penny on cheap capacitors. Then they started to sell "premium" motherboards with a dime's worth of better capacitors for an extra $30.
 
I have a friend who built a PC a few years ago with one of those bleeding-edge cards that requires one or two dedicated power connections and has a huge fan. He said his electric bill went up $30 that month. I saw the latest Fermi-based card, and the sucker draws something like 250 W. It gets so hot you could probably heat your house with it, and the giant fans probably sound like airplanes flying overhead.
I'm sure that people running SLI are deeply concerned about their power consumption.

I remember a few years ago there were issues in the build-it community with motherboards dying because companies saved a penny on cheap capacitors. Then they started to sell "premium" motherboards with a dime's worth of better capacitors for an extra $30
Yet you never talk about the soldering and thermal-paste problems of your Macs.

Bad capacitors hit everyone back in 2004/2005. Apple plays the lowest-bidder OEM game like everyone else; you just look at the skin.
 
Wow! Crappy video cards.... AGAIN?!

Seriously?

I find it so humorous that people in this thread claim we're always whining about something, and that Apple can never make us happy.

But please, take note: almost every single person complaining about the GPU is *only* complaining about the GPU, and has been complaining for YEARS!

It's the only complaint I ever have with Apple computers. We get great processors, decent HDDs and RAM, and fantastic design and build quality. But my God, the GPUs are always lackluster at best.

(snip) whine whine whine whine whine (snip)

The thing I find humorous about the "crappy video cards" argument is that the Dell XPS 8100, the closest comparable machine to the iMac in options and price (and one that still tops out at a 24" monitor in its tower-plus-monitor format), ships by default with the Radeon 5450.

In a unit whose design has to take heat dissipation into account, and which is designed to be a middleweight, you can't pack in the newest, (literally) hottest GPU.
 
Some might say "out of the frying pan and into the fire".

We are technically out of the recession, but the US economy is hurting pretty badly right now. While GDP is improving, I would say the fundamentals of our economy are degrading at a rapid pace.

That doesn't have anything to do with the recession. That's just how the US economy works: more and more debt.
 
Does anyone have any benchmarks for the 3.60GHz Core i5? It's available as a BTO option on the 21.5" model, and I wouldn't mind seeing how it stacks up against the i3s and the MBPs.

Thanks!
 
Seriously, no USB 3.0? And no Blu-ray, of course.

Seriously, **** this line of computers until then. I'm not paying a premium if they don't even offer these features as options.
 
Macs were a bargain in the PowerPC era? Really? I must have missed that. Do you remember waiting 6+ months for a new processor from Moto?

It's certainly hopeless to discuss history here, and you're free to remember only what was important to you. However, there's no doubt the blue G3 was a bargain, yes. A few G4 QuickSilver models were very nicely priced, and some G5s were great values as well, starting from relatively low prices.

Sure, the waiting was tiresome at times, but there was always a good surprise in the end. Nowadays, there's a slight shift of focus. The company's policy doesn't seem to care so much about the lower end anymore. For instance, there's still a cheap MacBook (to attract new Mac users), but there's no cheap iPhone, no more cheap Minis, and no cheap 27" model. All in all, it looks like the company's leadership has chosen to throw its poorest customers overboard, as if they weren't needed anymore. While this may look like a reasonable policy from an immediate profit-making point of view, I think it's a philosophical change that may be dangerous in the long run.

Just my five cents. Let Steve prove me wrong if he's still the real captain of the ship.
 
At least from Apple's iMac tech specs page, it appears that all models currently support:

"# Mini DisplayPort output with support for DVI, VGA, and dual-link DVI (adapters sold separately); 27-inch models also support input from external DisplayPort sources (adapters sold separately)
# Support for extended desktop and video mirroring modes
# Simultaneously supports full native resolution on the built-in display and up to a 30-inch display (2560 by 1600 pixels) on an external display"

So if I'm reading that right, it looks like it's still only on the 27" model? That's one thing I wish my 21.5" did: let me connect my desktop to it and just switch sources.
 
It's 26°C in my room, and the loudest components in my computer when running Handbrake are the hard drives.
Under load in games I can hear my Radeon 5970's fan spool up, but I don't use speakers; I have a nice set of Sennheiser headphones instead, so once those go on I usually can't hear a thing.

It's 27°C here, and otherwise the system is fairly quiet. I can hear the Scythe fans slightly, but "acoustics be damned" when you have this much power squeezed into a Micro ATX case :p

I have a friend who built a PC a few years ago with one of those bleeding-edge cards that requires one or two dedicated power connections and has a huge fan. He said his electric bill went up $30 that month. I saw the latest Fermi-based card, and the sucker draws something like 250 W. It gets so hot you could probably heat your house with it, and the giant fans probably sound like airplanes flying overhead.
Well, there's no doubt that leaving a computer on 24/7 will increase one's electric bill. That's why it's best not to (plus, you get greater component longevity out of it; it's like a two-fer!).
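For anyone curious, the arithmetic is easy to sketch. Here's a quick back-of-envelope in Python; the 250 W figure comes from the Fermi comment above, while the $0.12/kWh rate is just an assumed average, so plug in your own utility's number:

```python
# Back-of-envelope electricity cost for a power-hungry GPU.
# Assumptions: 250 W draw (the Fermi figure quoted above) and an
# illustrative $0.12/kWh rate; substitute your own numbers.

def monthly_cost(watts: float, hours_per_day: float,
                 rate_per_kwh: float = 0.12, days: int = 30) -> float:
    """Monthly cost in dollars of a load drawing `watts` for `hours_per_day`."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# Gaming 4 hours a day vs. leaving it crunching 24/7:
print(f"4 h/day:  ${monthly_cost(250, 4):.2f}/month")   # ~$3.60
print(f"24 h/day: ${monthly_cost(250, 24):.2f}/month")  # ~$21.60
```

So a $30 jump in a single month really only happens if the card (plus the rest of the system around it) is pinned more or less around the clock.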

As someone who has run both SLI and CrossFire, trust me, I know full well just how much power a system can draw under full load. But see, that's the thing: you usually aren't under full load. I have friends who sometimes keep their systems on to run @Home projects and such, but that's the exception, not the rule.

All of my systems, Macs and PCs, are set either to sleep after an hour of inactivity or to be shut down outright (in the case of my gaming system/workstation). Even with a "power efficient" iMac, you'll see savings by shutting it down when it won't be in use for an extended period. That's simple common sense.

As for Fermi running hot: yes, it does. It's a good thing, though, that the comparable Radeon 58XX series actually runs cooler than its predecessors. "Yay AMD, amirite?"

The Voodoo2 was nice, but hardly anyone did SLI back in the day, and Quake 2 ran just fine on one card. Wing Commander: Prophecy rocked on Glide compared to the nVidia Riva TNT2 I used as my main card.
I'm not disputing that most people didn't use SLI with the Voodoo2, but you said its main feature (and I should have said features, since SLI counted as another one upon release) went unsupported by games for over a year, and that wasn't true. SLI was "supported" right away (unlike nVidia's and AMD's implementations, 3Dfx's didn't require game support, if I recall), and dual-texturing was already supported by Quake II.

The Voodoo2s were nice, but were more a stepping stone of sorts to what is still one of my favorite GPU series of all time, the Voodoo3 series.

I forget, but I think the Ti 4600 was the first to have hardware T&L on the card. Either way, with both the Voodoo2 and the Ti 4600, it was at least 12-18 months before I saw games that listed their features or video RAM as required or recommended.
But that's been the case for ALL cards, even to this day; that's basically how it's always been. Games very rarely list even a top-of-the-line card as the recommended spec, because publishers know most people don't have one, and if people read that a $600 top-model card is "recommended", the game will likely sell fewer copies.

However, just as it's always been, buying a top-model card yields greater performance in games that aren't CPU-limited. A Radeon 5970 is overkill for the likes of Warcraft III, Half-Life 2, WoW, etc. It's most certainly not overkill for Crysis and the like (especially when Eyefinity is in use ;D)


These days we have YouTube, and people upload gameplay videos; before I chose my laptop with Intel HD graphics, I watched videos of people playing games that looked OK on it. I remember the FUD from Creative and 3Dfx when the Voodoo2 first came out about buying the 12MB version, or two of them to get SLI.
I'm sorry, but I can't imagine any game of the last few years running "OK" on Intel HD graphics. Even WoW, as old as it is, brings that "card" to its knees at anything above low-to-moderate settings.

With iMacs, Apple does the little things, like very high-quality power supplies that are 87% efficient or higher, which cost a lot of money on Newegg
Where are you seeing this? I'm not saying it isn't true, but I've never seen the 87% figure listed anywhere, and Apple certainly doesn't seem to give any information about it on their site beyond calling it "energy efficient".

and which Dell/HP don't use in their regular computers. They also use components with better capacitors. I remember a few years ago there were issues in the build-it community with motherboards dying because companies saved a penny on cheap capacitors. Then they started to sell "premium" motherboards with a dime's worth of better capacitors for an extra $30
You mean like how a few years ago Apple suffered from bad capacitors as well? http://arstechnica.com/civis/viewtopic.php?f=19&t=60930

Also, the capacitor "plague" was mostly limited to budget mainboards. Those are used in the DIY community, yes, but not to any great extent. It sucked, but it was typically covered under warranty (just as Apple covered the issue under warranty above).

I've used a wide range of mainboards, from cheap boards (in high school) to more premium ones (today, almost always ASUS), and I've never had a board fail on me. It obviously happens, but even a failure here or there wouldn't sway me (in fact, the only failures I've had have been hard drives and one GPU, and that one was actually my fault).
 
Other than that, it might have something minor to do with Intel's chipsets not supporting it (I think, anyway; correction welcome).
No native support until 2012 is what I heard.

Hi there,

Do you have any insight into the ETA of an iMac-viable 6-core CPU?
Would that release date likely coincide with USB 3 appearing on the iMac?

Take care, cxc
Assuming they stay with Intel, probably 2012 (for the 27") or later, and that's assuming there will be 6-core mainstream CPUs by then. I don't know if high-end desktop Sandy Bridge (6-8 cores) will get below 130 W, but that will be a different chipset from the 2-4 core mainstream desktop Sandy Bridge.

and

2013 at the earliest for 8 cores.
 
Wow!

This has got to be the most expensive Apple purchase day for my family – having four children is such a privilege:

2 * 21.5-inch iMac
2 * 27-inch Quad Core iMac
2 * iPhone 4 32GB
2 * Magic Trackpad
2 * iPad 3G 32GB
1 * 15-inch MacBook Pro
1 * 24-inch LED Cinema Display

And that doesn't even include my personal order:

1 * Mac Pro 12-core (32GB with 4 * SSD)
2 * 27-inch LED Cinema Display

Apple was nice enough to call me back and confirm that I will be one of the first to receive them, within a two-week window from today. Thank you Benny. I mean Apple :D

Can you adopt me!?!?!? Congrats!
 
Under load in games I can hear my Radeon 5970's fan spool up, but I don't use speakers; I have a nice set of Sennheiser headphones instead, so once those go on I usually can't hear a thing.

It's 27°C here, and otherwise the system is fairly quiet. I can hear the Scythe fans slightly, but "acoustics be damned" when you have this much power squeezed into a Micro ATX case :p
I'm running a 1GB GTX 460 now. It's not passive like my older HD 4830, but the step up in GPU power and the newfound CUDA/Folding@home performance is noticeable. 10,000 points per day while folding is impressive, to say the least.

The GF104 competes heavily with AMD's HD 5800 lineup, and the full 384-shader version, in GTX 475 form, should be coming out soon.

nVidia created an overclocking monster with the GF104. Going from 715 MHz to 825 MHz cost me only another 1°C. And it's 675 MHz at stock! Why did nVidia set such a miserably low stock speed on this thing?!

No native support until 2012 is what I heard.

Assuming they stay with Intel, probably 2012 (for the 27") or later, and that's assuming there will be 6-core mainstream CPUs by then. I don't know if high-end desktop Sandy Bridge (6-8 cores) will get below 130 W, but that will be a different chipset from the 2-4 core mainstream desktop Sandy Bridge.

and
LGA 1155 for Sandy Bridge (and hopefully Ivy Bridge as well) appears to be once again limited to a maximum of 4 cores and 8 threads.

The next mainstream socket is Haswell in 2013, where six cores gets passed over in favor of a full-blown eight.
 
I do video editing, web design, photography (using Photoshop and Aperture), and some light animation.

Torn between the 21.5" and the 27". If I go 27", I might as well go quad-core, I guess.

But if I go with the 21.5", Apple says its i3 is turbo-boosted (unlike the base 21.5"), so I was thinking there's no real reason to get the i5 version for $180 more. I mean, the difference would seem minor for the above tasks, right?

But if I did go with the 27" dual-core, would the i5 help since the screen is bigger? Or would it not matter?

Also the 27" wether it's dual or quad core gives you the option for an ATI Radeon 5750 graphics card with 1gb of sdram over the included ATI 5670 with 512mb of sdram. What tasks of the ones I listed above would help the most in?

Lastly... if I went with the 27" without the SSD option, could I add one later to complement the 1TB HDD I'd get now?

Thanks
 
So if I'm reading that right, it looks like it's still only on the 27" model? That's one thing I wish my 21.5" did: let me connect my desktop to it and just switch sources.
No, the 27" models also support input from external DisplayPort sources.

From how I read it, even the entry-model 21.5" iMac should have external display support. :)

It's certainly hopeless to discuss history here, and you're free to remember only what was important to you. However, there's no doubt the blue G3 was a bargain, yes. A few G4 QuickSilver models were very nicely priced, and some G5s were great values as well, starting from relatively low prices.
I still have an 867 QuickSilver chugging along nicely.

The blue G3 was a fairly decent bargain. When they moved to the G4, though, Apple did begin to up-price everything... I remember when, for a short time, people's best option if they wanted a G4 was the G4 Cube, because it was sadly the least expensive G4 product Apple had. Once the Cube flopped, Apple lowered the price of the entry-model PowerMac G4. Even then, though, I think the entry price was $1800 or so?

Remember the excitement over a PowerMac G5 at $1500? Yeah, I think that was the cheapest they ever got. Granted it was neutered. :p

Sure, the waiting was tiresome at times, but there was always a good surprise in the end. Nowadays, there's a slight shift of focus. The company's policy doesn't seem to care so much about the lower end anymore. For instance, there's still a cheap MacBook (to attract new Mac users), but there's no cheap iPhone, no more cheap Minis, and no cheap 27" model. All in all, it looks like the company's leadership has chosen to throw its poorest customers overboard, as if they weren't needed anymore. While this may look like a reasonable policy from an immediate profit-making point of view, I think it's a philosophical change that may be dangerous in the long run.
Eh... wasn't there a stretch where it took over a year for the G4 to go from 450 MHz to 500 MHz? People tend to sugar-coat the PowerPC past these days (and I'm a little guilty of it myself :p), but there were times, especially when the G4 was first coming out, when it was definitely painful to be an Apple customer. Granted, this was a time when Apple had to use semi-false advertising about the G4's performance to even maintain some semblance of competition with Intel- and AMD-based systems. The worst had to be when the Athlon 64 was released; it destroyed everything and anything in its way.

To be fair about the 27" iMac, though, it does use an IPS panel, and 27" IPS LCDs are fairly expensive (isn't Dell's still around $1000-1100?).


Just my five cents. Let Steve prove me wrong if he's still the real captain of the ship.
Uh, the whole reason Apple is becoming more of a mobile/media device company is because of Steve. :p
 
Sigh.

I was talking about 6, not 8. I'm not that crazy ;)

You mean 12, then, since here are my current 2.8 GHz Core i7 iMac's processor meters:

[attached image: octomac.jpg]


With the Core i7 having 8 threads, you get 4 virtual cores on top of the 4 physical ones.

With the right apps you can utilize those faux cores like real ones.

I'm doing it now, actually, with a number of apps (BOINC, Handbrake, etc.).
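If you want to see the physical/faux split for yourself, here's a tiny Python sketch; it assumes the third-party psutil package is installed (os.cpu_count() alone only reports logical threads):

```python
# Count physical cores vs. hardware threads ("faux cores").
# Assumes `pip install psutil`; os.cpu_count() only sees logical CPUs.
import os
import psutil

logical = os.cpu_count()                    # 8 on this Core i7
physical = psutil.cpu_count(logical=False)  # 4 physical cores
print(f"{physical} physical cores, {logical} hardware threads")
print(f"Hyper-Threading adds {logical - physical} extra schedulable threads")
```

Well-threaded apps like Handbrake will happily schedule work across all eight, which is exactly why the meters above are all busy.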

The Core i7 Extreme is a true 6-core chip, but I don't think we'll see it in an iMac anytime soon, since it uses a different socket (it would need a different motherboard), and it's rather expensive: $999 per chip in single-unit lots.

It would almost need a different case due to the extra heat.
 
The Pound has fallen slightly against the Dollar in the meantime, from around $1.60 to $1.55. Hence the slight increase in the UK price. It's exactly the same thing that's causing petrol to be so damn expensive at the moment.

Not really the same thing at all. Petrol prices have been driven primarily by the cost of the raw material to the refinery. However, feel free to re-read my comment taking the 3% currency fluctuation into account:

"The 27" i5 has miserly reduction in price from £1687 to £1649. With that you get a processor speed bump of 0.14GHz and a slightly better graphics processor. After nine months there's no measurable fall in the component costs?

If you could also provide data on the processor and memory cost changes over the nine months, that would be of interest.

The fact remains that nine months ago, essentially the same machine could have been bought for less. A 3% currency variance does not excuse this."
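To put rough numbers on that, here's a quick Python sketch using only the figures from this exchange (£1687 at roughly $1.60/£ nine months ago, £1649 at roughly $1.55/£ now; both rates are the approximations quoted above):

```python
# Separate the currency effect from any real price change,
# using the prices and approximate exchange rates from this thread.
old_gbp, old_rate = 1687, 1.60   # nine months ago
new_gbp, new_rate = 1649, 1.55   # today

old_usd = old_gbp * old_rate     # ~$2699
new_usd = new_gbp * new_rate     # ~$2556

print(f"Sterling drop: {(1 - new_gbp / old_gbp) * 100:.1f}%")   # ~2.3%
print(f"Dollar drop:   {(1 - new_usd / old_usd) * 100:.1f}%")   # ~5.3%
```

Whether a roughly 5% drop in dollar terms over nine months is a reasonable reflection of maturing component costs is the real question.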
 
How does this compare to the 2.8GHz Core i7 iMac I bought in the last refresh?

Minor CPU bump, maybe a significant graphics bump. The ability to install a second drive (maybe just an SSD, though) inside. A slight upgrade to the SD card reader, too.

Nothing that should make you regret buying a 2009 iMac, at the high end.

At least, that's how I see it (I'm still thrilled with my 27" i7).
 
The Xeon 5600s are 32nm just like the Core i series, and the 5600s only came out in volume a month ago, compared to January for the Core i line. The 5600s hit the market in late April, but Dell and HP offered them only as BTO options on servers, with a several-week wait. Starting around June they hit the pre-built systems.

You could argue that Intel spent the time perfecting the process to get yields high enough to release Xeons on 32nm. The CPUs were 5600-class Xeons, but the process wasn't mature enough to label them as such, so features were disabled and they were sold as Core i3/i5/i7. As the process matured, Intel was able to produce "perfect" CPUs that get the Xeon label.

And the way Intel manufacturing works, all the new stuff is first made in Oregon, where they work out the manufacturing process; then the same process is copied to their other fabs around the world to produce it in volume.

Thanks for the information.
 
Where are you seeing this? I'm not saying it isn't true, but I've never seen the 87% figure listed anywhere, and Apple certainly doesn't seem to give any information about it on their site beyond calling it "energy efficient".
http://images.apple.com/environment...Mac_Product_Environmental_Report_20100727.pdf

The power supplies are 87% efficient. I've looked on Newegg, and supplies that efficient carry a nice premium over 80%-efficient ones.
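For what it's worth, the efficiency gap translates into only modest running-cost savings. A minimal sketch, assuming an illustrative 150 W average draw, 8 hours of use a day, and $0.12/kWh (none of those figures come from Apple's report):

```python
# Rough running-cost comparison: 80% vs. 87% efficient power supplies.
# The 150 W load, 8 h/day usage, and $0.12/kWh rate are all assumptions.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Watts pulled from the wall to deliver `dc_load_w` to the components."""
    return dc_load_w / efficiency

load = 150.0
for eff in (0.80, 0.87):
    watts = wall_draw(load, eff)
    yearly_kwh = watts * 8 * 365 / 1000
    print(f"{eff:.0%} efficient: {watts:.0f} W at the wall, "
          f"~${yearly_kwh * 0.12:.2f}/year")
# 80%: ~188 W, ~$65.70/year; 87%: ~172 W, ~$60.41/year.
```

So the premium buys only a few dollars a year in electricity; the bigger wins are less heat inside the case and, usually, better build quality.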
 
I hope these iMacs don't have the screen-noise issue the old models had when you dimmed the screen (a high-pitched buzzing noise from the inverter). It was well documented on Apple Support Discussions.

If the new one isn't affected, I'd love to buy one.
 