
saytheenay

macrumors regular
Original poster
Jul 6, 2012
My wife needs a new computer and is considering the move from a 2006 IBM Thinkpad T60 (pre-Lenovo acquisition) to an iMac. While we are both disappointed by no 2012 iMacs yet, she needs something soon and if there are no new iMacs by August 1st, we will be going for a 2011 iMac (higher-end BTO).

She will be using the CS6 Standard Suite (Photoshop, Illustrator, and InDesign)—she most likely won’t be doing much video. In addition, I will be setting it up with Windows 7 via VirtualBox and/or Boot Camp so she can regularly use ESRI’s ArcGIS, including a bunch of their add-on modules (from what I understand, the system requirements for ArcGIS are similar to CAD software).

We can live with older hardware—we are more concerned about Apple’s OS X minimum system requirements. Specifically, it seems iMacs have an average shelf life of ~5 years:
- Mountain Lion 10.8 (released 2012), minimum iMac (mid 2007)
- Lion 10.7 (released 2011), minimum iMac (late 2006)
- Snow Leopard (2009), minimum iMac (late 2005)

If we get a 2011 iMac now, does that mean we won’t be able to run the latest OS X in 3 years, versus waiting for the 2012 and getting 5 years out of it? Or is it likely Apple will give it 4-5 years, given they are going a year and a half (or longer) before the next refresh?
 
I believe every future update for the 2012 machines will also be available for the 2011 ones. Both have 64-bit CPUs, and both already have Thunderbolt.

So from a future-proofing standpoint, I don't see any reason to be worried about 2011 iMac support. Except for a minor speed bump, you really don't lose much.
 
Interesting--that makes sense and hopefully proves to be true.

Another question: given the software we will be using, is there any real benefit to having 2 gigs of video RAM, or will we be fine with just 1 gig? It's unlikely we will be playing any heavy-duty games on it.
 

With something like the 6970M in the current iMac, you'll be fine with 1GB of VRAM. I had some quite silly arguments with other members here over this very question.

But I stand by my point: you don't want 2GB of VRAM with that mediocre GPU, unless you don't mind paying for it.
VRAM is much less essential than system RAM. Only a few applications and games can make use of extra RAM on your GPU.

Filling a large amount of VRAM takes something massive, like 4K resolution, multiple displays, and ultra-high-resolution texture packs, plus a super-fast desktop graphics chip (a GTX 680 or Radeon 7970 and beyond) to make use of it. And by the time that happens, I believe the 6970M will have been brought to its knees. So it's a moot point.

Metaphorically: you only need a race track to test a Ferrari, not a scooter.

Hope you get my point.
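
For a rough sense of scale, here's a back-of-the-envelope sketch of how little VRAM the displays themselves use (the 4 bytes per pixel and triple buffering are assumptions for illustration, not measurements of any particular driver):

```python
# Rough VRAM needed just for framebuffers, assuming 32-bit (4-byte) pixels and
# a triple-buffered swap chain; textures and geometry come on top of this.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

displays = {
    "27-inch iMac (2560x1440)": (2560, 1440),
    "4K display (3840x2160)": (3840, 2160),
    "two 4K displays": (2 * 3840, 2160),
}
for name, (w, h) in displays.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB")
```

Even two 4K displays only account for roughly 190 MB of framebuffer, so it's really huge texture sets and GPU-compute working sets, not the desktop itself, that push past 1GB.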
 
I do get your point. Thanks for your reply and confirming my suspicion that more than a gig is unnecessary. We will be going for at least 8 gigs of RAM, possibly 16, so that should be fine for our needs.
 

Larger VRAM is usually more of a requirement for something like GPU computation. I'm not sure about games; the typical argument there is that the fill rate would be insufficient even if you could use more memory. Say you're talking about CUDA rendering in After Effects with their new ray tracer (you couldn't use an AMD card for this anyway): the problem size has to fit within the available VRAM. The issue is that hardware ray tracing usually involves loading the entire scene or its objects and addressing it triangle by triangle. Photoshop also suggests 1GB, but the minimum in CS6 is 512MB. This is for OpenGL/OpenCL-accelerated features. In a version or two it will likely go up again.
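
As a hedged illustration of why the whole scene has to fit in VRAM, here's a toy estimate (the bytes-per-triangle figure, acceleration-structure overhead, and texture budget are assumptions made up for the example, not After Effects internals):

```python
# Toy estimate of a GPU ray tracer's working set: geometry, an acceleration
# structure, and textures all have to be resident in VRAM at the same time.
def scene_vram_gb(triangles, bytes_per_triangle=64, texture_mb=512, accel_overhead=0.5):
    geometry = triangles * bytes_per_triangle       # vertex + index data (assumed)
    accel = geometry * accel_overhead               # BVH overhead, assumed ~50%
    textures = texture_mb * 1024 ** 2
    return (geometry + accel + textures) / (1024 ** 3)

for tris in (1_000_000, 10_000_000, 50_000_000):
    print(f"{tris:>11,} triangles -> ~{scene_vram_gb(tris):.2f} GB")
```

The point isn't the exact numbers; it's that, unlike a game streaming textures in and out, a GPU ray tracer can simply refuse to render once the scene no longer fits in VRAM.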
 

Agreed, it will be useful (even required) for GPU computing tasks or heavy gaming.

But then again, in a version or two these old GPUs won't be able to handle those things properly or fast enough anyway, let alone benefit from an enormous amount of VRAM. So most likely you'll end up buying a new card instead.
 

There are technically things that can max it out today. Some of the Tesla cards currently go up to 6GB, strictly for computation purposes. Obviously that's way beyond what most people would use. Some desktop GPUs currently ship with 3GB of GDDR5. Your link mentions the HD 7970. Apple often goes a bit skimpy: when things like Photoshop (not that heavy a program if your files aren't gigantic) are suggesting 1GB, it's very silly seeing the mini ship with 256MB. RAM is an interesting thing in that it generally changes by a factor of 2; 1.5GB and 3GB configurations aren't common outside higher-end models when a specific RAM type is still expensive.

Games aren't typically the kind of thing that specifies VRAM allocations. They'll just list some obscure card reference for minimum and suggested specs. If it was anything to do with CG and working with high-resolution models with a textured view enabled, that could saturate your VRAM, but that works totally differently from games. If that is the point of focus, I wouldn't suggest dipping below 1GB today, and that number may shoot upward regularly.
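
To make the "textured view" point concrete, here's a rough sketch of how quickly texture memory adds up (4 bytes per texel and ~33% mipmap overhead are standard rules of thumb; the texture counts are made up for illustration):

```python
# Uncompressed RGBA textures at 4 bytes per texel, plus roughly a third extra
# for the full mipmap chain (1 + 1/4 + 1/16 + ... = 4/3).
def texture_set_gb(count, size):
    bytes_per_texture = size * size * 4 * (4 / 3)
    return count * bytes_per_texture / (1024 ** 3)

print(f"50 textures at 2048x2048: ~{texture_set_gb(50, 2048):.2f} GB")
print(f"50 textures at 4096x4096: ~{texture_set_gb(50, 4096):.2f} GB")
```

A scene with a few dozen 4K texture maps visible at once is already past 1GB, which is why textured CG viewports hit the VRAM ceiling long before typical 2D work does.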

Edit: for the OP, I wouldn't count on 5 years either way. Things change and Apple is fickle. Ideally the 2012s would already be out, and I'd suggest going refurb to save money. While I don't suggest feeding landfills, Apple is too unpredictable, so I don't suggest spending more in favor of future-proofing. The chance of hardware failure or an early support cutoff is always there.

http://osxdaily.com/2012/02/16/os-x-10-8-mountain-lion-system-requirements/

Note some of the cutoffs: certain models are cut off at the 2009 mark. The point is, do not count on Apple. Adobe, Autodesk, and some of the other large developers also cut support for anything Apple removes from the supported list, yet developers always incur a higher animosity factor. Anyway, a good way to do it would be to budget the system carefully and figure on it covering a roughly three-year requirement. If you get a longer life from it, enjoy. I just see way too many people max out their Macs at the time of purchase with the idea that they'll last much longer than many of them actually run or remain supported.
 
Another thing to note is that just because an OS upgrade does not support your Mac, it does not mean your Mac will stop turning on the next day or become completely obsolete. You can still keep using it.
 
If we get a 2011 iMac now, does that mean we won’t be able to run the latest OS X in 3 years, versus waiting for the 2012 and getting 5 years out of it? Or is it likely Apple will give it 4-5 years, given they are going a year and a half (or longer) before the next refresh?

Nobody can tell for sure, but you'll probably get cut off from new OS support eventually.
 
It's the main reason I won't entertain a 2011 iMac, which is otherwise a fine, well-specced computer.

Apple is becoming increasingly ruthless about cutting off OS support for older Macs. Buy one that's effectively a year old already, and you'll be forced into upgrading a year earlier than you might have anticipated if you want to stay up to date down the line. I learned this lesson when my 2006 iMac couldn't run Lion (and therefore iCloud).
 
Another thing to note is that just because an OS upgrade does not support your Mac, it does not mean your Mac will stop turning on the next day or become completely obsolete. You can still keep using it.

And to further this: as we have already seen with Mountain Lion, if you are adamant that you want the latest and greatest OS and it isn't officially supported, it may be possible to force it to run.
 

Some things are arbitrarily withheld from earlier OS versions, though. iCloud is a good example. Folk still using Snow Leopard couldn't migrate MobileMe to iCloud without upgrading to Lion. If your Mac couldn't run Lion, you were SOL. There was no technical reason why Snow Leopard couldn't have got iCloud. Hell, every version of Windows did.
 
Another thing to note is that just because an OS upgrade does not support your Mac, it does not mean your Mac will stop turning on the next day or become completely obsolete. You can still keep using it.

Absolutely right. If you assume 5 years before they drop support for the hardware, and then another year or more before you find it necessary to be running the latest version of something, it's not too bad. That's 6 years or more between hardware updates.
 
You're asking an impossible question. Nobody here can see into the future. You won't get a definite answer on how long any Mac will be supported for. Apple haven't even decided that yet.

I don't see any reason why a 2011 iMac won't be up to any of those tasks in 5 years though, even if it doesn't support OS X 10.12 Sea Lion or whatever it will be by then.
 
Yet another reason why I'm waiting. Bad enough you have to overpay for 2011 technology, but you also have to upgrade a year earlier down the line. So you lose out three times: price, upgrade, and features!
 
With something like the 6970M in the current iMac, you'll be fine with 1GB of VRAM. I had some quite silly arguments with other members here over this very question.

But I stand by my point: you don't want 2GB of VRAM with that mediocre GPU, unless you don't mind paying for it.
VRAM is much less essential than system RAM. Only a few applications and games can make use of extra RAM on your GPU.

Filling a large amount of VRAM takes something massive, like 4K resolution, multiple displays, and ultra-high-resolution texture packs, plus a super-fast desktop graphics chip (a GTX 680 or Radeon 7970 and beyond) to make use of it. And by the time that happens, I believe the 6970M will have been brought to its knees. So it's a moot point.

Metaphorically: you only need a race track to test a Ferrari, not a scooter.

Hope you get my point.

Unbelievable. LMAO!!!!!!
 
Yet another reason why I'm waiting. Bad enough you have to overpay for 2011 technology, but you also have to upgrade a year earlier down the line. So you lose out three times: price, upgrade, and features!

Bottom line and end of story.
 
No USB 3 is a deal breaker for me. You get a clear benefit when using inexpensive external drives, whether for Time Machine or just data transfer. It's a shame they are still not offering it on a machine sold in 2012!
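
For a sense of what that benefit looks like in practice, here's a quick sketch of backup times (the throughput figures are ballpark real-world numbers I'm assuming, not benchmarks of any particular drive):

```python
# Time to copy a 500 GB backup at ballpark real-world throughputs (MB/s).
# These figures are rough assumptions, limited by the drive as much as the bus.
throughputs_mb_s = {
    "USB 2.0": 35,
    "FireWire 800": 80,
    "USB 3.0": 180,
    "Thunderbolt": 250,
}

backup_gb = 500
for bus, mb_s in throughputs_mb_s.items():
    hours = backup_gb * 1024 / mb_s / 3600
    print(f"{bus:<13} ~{hours:.1f} hours")
```

So a full Time Machine run that takes an afternoon over USB 2 drops to around an hour over USB 3 or a Thunderbolt enclosure, which is the practical argument either way.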
 
"she needs something soon and if there are no new iMacs by August 1st, we will be going for a 2011 iMac (higher-end BTO)."

I suggest you revise your "cutoff date" forward to September 1st.

My guess (only a "guess") is that Apple wants to have an updated iMac "out there" for the back-to-school crowd...
 
My guess (only a "guess") is that Apple wants to have an updated iMac "out there" for the back-to-school crowd...

Based on the supply chain report, what Apple wants and what they're getting are completely different. October 1 is closer to realistic.
 
Well, my wife has been waiting since May, when the rumors were pointing to July. And now they are pointing to September, when they may very well point to October, etc.

She is tired of waiting and I can't blame her. We ordered a fully-loaded 2011 iMac (3.4, SSD+HDD) that should last her another 5-6 years.

Yes, it's older hardware, however, going from a 15" 2006 laptop to a 27" desktop with a huge jump in performance makes it worth it.

When I got my late-2010 MBP in 12/2010, they came out with Sandy Bridge 3 months later. However, I needed a new system right away and my 2004 IBM ThinkPad was literally falling apart. Buy what you need when you need it.
 
I think the only thing putting me off buying at the moment is the graphics card. The processor bump and RAM aren't as important to me, as processors don't change TOO much and RAM can always be upgraded. Screen size and design I couldn't care less about. USB 3 I'm not too bothered by either, as it has Thunderbolt. But the graphics card is old and, as we've seen with Mountain Lion, the graphics card is what pushes a lot of otherwise perfectly good Macs out of the 'Supported Macs' list. Plus I just hate buying technology that I know is already out of date.
 

Well done, Sir. I have no doubt you will both get great service and enjoyment from your new Mac - enjoy!
 

I'm almost to that point myself. I'm going to hold out, but this wait is freaking torture. I'm anticipating the new hardware - USB 3 and the video card. I'm sure the fully loaded iMac you purchased will definitely hold you over for 5 years. Congrats on the purchase.
 